
The Consequences of Coronavirus


Like most Australians, I have spent the past few weeks isolated in my home. With stores closed and public events cancelled, many of us have had to find new ways of keeping ourselves entertained. For me, this period of isolation has been spent reading, writing, and reflecting. When one is relaxing, however, it is easy to forget about the outside world, and easier still to forget that the long-term consequences of Covid-19 will far outweigh any short-term inconveniences we may be suffering.

Economic

After its human victims, the first casualty of Covid-19 will be the health and vitality of the global economy. Nations like Australia have decided, quite rightly, that their most immediate priority is to protect the health of their citizens. The lockdowns, social distancing, and other measures taken to prevent the spread of Covid-19 have certainly been effective, but they have come with negative economic consequences.

This fact has been recognised by authorities ranging from the Australian Prime Minister to the World Economic Forum. The World Economic Forum has warned that Covid-19 will keep “large parts of the global economy shuttered” through April. This view was echoed by J.P. Morgan, which stated that Covid-19 had pushed the world’s economy into a twelve per cent contraction.

Particularly hard hit will be the tourism and hospitality industries. The Asia Conference stated that the negative impacts of the virus are “likely to worsen as the outbreak continues to disrupt tourism, trade, supply chains, and investments in China.” Likewise, the World Travel and Tourism Council has warned that the economic impact of Covid-19 could wipe out fifty million jobs in the travel and tourism industries.

Political

The second casualty will be a change in the way much of the world thinks about its relationship with China. It took the Chinese Communist Party a month to bother informing the World Health Organisation of the existence of Covid-19. Thanks to its incompetence, the virus was able to spread beyond China’s borders. Many people will be left asking: can we really trust a government that has proven itself to be so intrinsically untrustworthy?

The Chinese Communist Party’s reaction to negative press has not exactly endeared it to the world, either. Chinese authorities have been quick to clamp down on anyone who contradicts the claim that the Chinese response to the virus has been effective. In one notable case, a post made by Dr. Li Wenliang on WeChat was dismissed as “illegal acts of fabricating, spreading rumours, and disrupting social order” because it claimed that victims of Covid-19 were being quarantined at the hospital where he worked.

China’s attempts to crack down on negative press outside its borders have been less successful. In February, Ivo Daalder wrote in the Chicago Tribune that the Chinese government’s secrecy over Covid-19 made the situation worse than it needed to be. “The fact that China chose secrecy and inaction turned the possibility of an epidemic into a reality”, Daalder wrote in his article.

Daalder’s article was picked up by several publications, including the Korea Herald and the Kathmandu Post, the latter publishing it with an illustration of Chairman Mao wearing a surgical mask. The Chinese Embassy in Nepal dismissed the article as “malicious.” The Nepalese press responded by accusing the Chinese embassy of making a “direct threat to the Nepali people’s right to a free press.”

Social

The third casualty of Covid-19 will be the globalist philosophy that has dominated politics over the past few years. People have discovered, much to their chagrin, that the spread of Covid-19 has been facilitated by the ideals of openness that globalism espouses. They are discovering that open borders, mass migration, and crowded housing are harbingers of disease. It is very unlikely that people will be as accepting of open borders and high immigration as they once were.

The ability to share products and ideas across borders is a wonderful innovation. However, people must be willing to accept that the transfer of these things from one place to another also brings the transfer of less palatable things, like crime and disease. And, truth be told, most people aren’t. This fact has not been lost on many of Europe’s right-wing political parties, which are now calling for tighter restrictions on borders during the pandemic.

Although the decline in globalism is sorely needed, Covid-19 has also brought an increase in racism and xenophobia, particularly against Asian people. According to Business Insider, instances of racist and xenophobic attacks, ranging from verbal abuse to physical assault, have increased during the Covid-19 pandemic. The sad truth is that discrimination and hatred go hand-in-hand with pandemics: if you associate a group of people with a particular disease and then refuse to associate with them, you are much less likely to catch that disease yourself.

Conclusion

The long-term consequences of Covid-19 are going to be far more severe than the inconveniences it currently poses. Measures to restrict its spread have inflicted profound economic damage, especially in the hospitality and tourism sectors, that will take years to heal. Similarly, relations between China and the world have been tarnished by the Communist Party’s vehement attacks on negative (and richly deserved) criticism and its refusal to be honest about the situation. Finally, Covid-19 will see a decline in the popularity of globalism, open-border policies, and mass migration. This pandemic has marked the beginning of a brave new world.

On Constitutional Monarchy


I would like to begin this essay by quoting a poem by the English Romantic poet, William Wordsworth (1770 – 1850):

 

    Milton! thou shouldst be living at this hour:
    England hath need of thee: she is a fen
    Of stagnant waters: altar, sword, and pen,
    Fireside, the heroic wealth of hall and bower,
    Have forfeited their ancient English dower
    Of inward happiness. We are selfish men;
    Oh! raise us up, return to us again;
    And give us manners, virtue, freedom, power.
    Thy soul was like a Star, and dwelt apart:
    Thou hadst a voice whose sound was like the sea:
    Pure as the naked heavens, majestic, free,
    So didst thou travel on life’s common way,
    In cheerful godliness; and yet thy heart
    The lowliest duties on herself did lay.

 

The poem, entitled London, 1802, is Wordsworth’s ode to an older, nobler time. In it, he attempts to conjure up the spirit of John Milton (1608 – 1674), the poet and civil servant immortalised as the author of Paradise Lost.

Milton acts as the embodiment of a nobler form of humanity. He symbolises a time when honour and duty played a far greater role in the human soul than they did in Wordsworth’s day, or do in ours. It is these themes of honour, duty, and nobility that provide the spiritual basis for constitutional monarchy.

It is a subject I will return to much later in this essay. To begin in earnest, however, it would be more prudent to examine those aspects of English history that allowed both constitutional monarchy and English liberty to be born.

The English monarchy has existed for over eleven hundred years. Stretching from King Alfred the Great in the 9th century to Elizabeth II in the 21st, the English people have seen more than their fair share of heroes and villains, wise kings and despotic tyrants. Through their historical and political evolution, the British have developed, and championed, ideals of liberty, justice, and good governance. The English have gifted these ideals to most of the Western world through the export of their culture to their former colonies.

It is a sad reality that there are many people, particularly left-wing intellectuals, who need to be reminded of the contributions the English have made to world culture. The journalist Peter Hitchens (1951 – ) noted in his book The Abolition of Britain that abhorrence for one’s own country is a unique trait of the English intellectual. Similarly, George Orwell (1903 – 1950) once observed that an English intellectual would sooner be seen stealing from the poor box than standing for “God Save the King.”

However, these intellectuals fail to notice, in their arrogance, that “God Save the King” is actually a celebration of constitutional monarchy, not symbolic reverence for an archaic and rather powerless royal family. It celebrates the nation as embodied in a single person or family, and the fact that the common man and woman can live in freedom because there are constitutional restraints placed on the monarch’s power.

If one’s understanding of history has come from films like Braveheart, it is easy to believe that all people in all times have yearned to be free. A real understanding of history, one that comes from books, however, reveals that this has not always been the case. For most of history, people lived under the subjugation of one ruler or another. They lived as feudal serfs, subjects of a king or emperor, or in some other such arrangement. They had little reason to expect such arrangements to change and little motivation to try and change them.

At the turn of the 17th century, the monarchs of Europe began establishing absolute rule by undermining the traditional feudal institutions that had been in place for centuries. These monarchs became all-powerful, wielding jurisdiction over all forms of authority: political, social, economic, and so forth.

To justify their mad dash for power, Europe’s monarchs required a philosophical argument that vindicated their actions. They found it in a political doctrine known as ‘the divine right of kings.’ This doctrine, formulated by the Catholic bishop Jacques Bossuet (1627 – 1704) in his book Politics Derived from Sacred Scripture, argued that monarchs were ordained by God and therefore represented His will. It was the duty of the people to obey the monarch without question. As such, no limitations could be placed on a monarch’s power.

What Bossuet was suggesting was hardly new, but it did provide the justification many monarchs needed to centralise power in themselves. King James I (1566 – 1625) of England and Scotland saw monarchs as God’s lieutenants and believed that their actions should be tempered by the fear of God, since they would be called to account at the Last Judgement. On the basis of this belief, King James felt perfectly justified in proclaiming laws without the consent of Parliament and involving himself in cases being tried before the courts.

When King James died in 1625, he was succeeded by his second-eldest son, Charles (1600 – 1649). King Charles I assumed the throne during a time of political change. He was an ardent believer in the divine right of kings, a belief that caused friction between the monarch and Parliament, from whom he had to obtain approval to raise funds.

In 1629, Charles outraged much of the population, as well as many nobles, when he elected to raise funds for his rule using outdated taxes and fines, and stopped calling Parliament altogether. Charles had been frustrated by Parliament’s constant attacks on him and its refusal to furnish him with money. The ensuing period would become known as the Eleven Years’ Tyranny.

By November 1640, Charles had become so bereft of funds that he was forced to recall Parliament. The newly assembled Parliament immediately began clamouring for change. It asserted the need for a regular parliament and sought changes that would make it illegal for the King to dissolve the political body without the consent of its members. In addition, Parliament pressured the King into executing his friend and advisor, Thomas Wentworth (1593 – 1641), the 1st Earl of Strafford, for treason.

The result was a succession of civil wars that pitted King Charles against the forces of Parliament, led by the country gentleman Oliver Cromwell (1599 – 1658). Hailing from Huntingdon, Cromwell was a descendant of Henry VIII’s (1491 – 1547) chief minister, Thomas Cromwell (1485 – 1540). In the end, the conflict would decimate the English population and forever alter England’s political character.

The English Civil War began in January 1642 when King Charles marched on Parliament with a force of four hundred men. He withdrew from London after being denied entry, eventually establishing his court at Oxford. Trouble was brewing. Throughout the summer, people aligned themselves with either the monarchists or the Parliamentarians.

The forces of King Charles and the forces of Parliament met at the Battle of Edgehill in October. What followed was several years of bitter and bloody conflict.

Ultimately, it was Parliament that prevailed. Charles was captured, tried for treason, and beheaded on January 30th, 1649. England was transformed into a republic, or “commonwealth.” The English Civil War had claimed the lives of two hundred thousand people, divided families, and facilitated enormous social and political change. Most importantly, however, it set the precedent that a monarch could not rule without the consent of Parliament.

The powers of Parliament had been steadily increasing since the conclusion of the English Civil War. However, total Parliamentary supremacy had proven unpopular. The Commonwealth created in the wake of the war collapsed shortly after Oliver Cromwell’s death. When this happened, it was decided to restore the Stuart dynasty.

The exiled Prince Charles returned from exile and was crowned King Charles II (1630 – 1685). Like his father and grandfather, Charles was an ardent believer in the divine right of kings. This view put him at odds with the ideas of the Enlightenment, which challenged the validity of absolute monarchy, questioned traditional authority, and idealised liberty.

By the third quarter of the 17th century, Protestantism had triumphed in both England and Scotland. Ninety per cent of the British population was Protestant. The Catholic minority was seen as odd, sinister, and, in extreme cases, outright dangerous. People equated Catholicism with tyranny, linking French-style autocracy with popery.

It should come as no surprise, then, that Catholics became the target of persecution. Parliament barred them from holding offices of state and banned Catholic forms of worship. Catholics could not become members of Parliament, justices of the peace, or officers in the army, or hold any other public office, unless they were granted a special dispensation by the King.

It is believed that Charles II may have been a closet Catholic. He was known for pardoning Catholics for crimes (controversial considering Great Britain was a Protestant country) and for ignoring Parliament.

However, Charles’ brother and successor, James (1633 – 1701), was a Catholic beyond any shadow of a doubt. He had secretly converted in 1669 and was forthright in his faith. After his first wife, Anne Hyde (1637 – 1671), died, James had even married the Italian Catholic Mary of Modena (1658 – 1718), a decision that hardly endeared him to the populace.

The English people became alarmed when it became obvious that Charles II’s wife, Catherine of Braganza (1638 – 1705), would not produce a Protestant heir. It meant that Charles’ Catholic brother, James, was almost certain to succeed him on the throne. So incensed was Parliament at the prospect of a Catholic on the throne that it attempted to pass the Crown on to one of Charles’ Anglican relatives.

Their concern was understandable, too. The English people had suffered the disastrous effects of religious intolerance ever since Henry VIII had broken away from the Catholic Church and established the Church of England. The result had been over a hundred years of religious conflict and persecution. Mary I (1516 – 1558), a devout Catholic, had earnt the moniker “Bloody Mary” for burning Protestants at the stake. During the reign of King James I, Guy Fawkes (1570 – 1606), along with a group of Catholic terrorists, had attempted to blow up Parliament in the infamous Gunpowder Plot.

Unlike Charles II, James made his faith publicly known. He desired greater tolerance for Catholics and non-Anglican dissenters like Quakers and Baptists. The official documents he issued, designed to bring about an end to religious persecution, were met with considerable objection from both bishops and Europe’s Protestant monarchs.

Following the passage of the Test Act in 1673, James had briefly been forced to relinquish his offices. The Act required officers and members of the nobility to take Holy Communion as spelt out by the Church of England. It was designed to prevent Catholics from taking public office.

Now, as King, James attempted to repeal the Test Act and placed Catholics in positions of power. His Court featured many Catholics, and he became infamous for approaching hundreds of men – justices, wealthy merchants, and minor landowners – to stand as future MPs and, in a process known as ‘closeting’, attempting to persuade them to support his legal reforms. Most refused.

Nor was that the limit of James’ activities. He issued two Declarations of Indulgence, ordered them to be read from every pulpit on two successive Sundays, and put those who opposed them on trial for seditious libel. Additionally, he imprisoned seven bishops for opposing him, made sweeping changes to the Church of England, and built an army composed mainly of Catholics.

The people permitted James II to rule so long as his daughter, the Protestant Princess Mary (1662 – 1694), remained his heir. All this changed, however, when Mary of Modena produced a Catholic heir: James Francis Edward Stuart (1688 – 1766). When James declared that the infant would be raised Catholic, it immediately became apparent that a Catholic dynasty was about to be established. Riots broke out. Conspiracy theorists posited that the child was a pawn in a Popish plot. The child, the theory went, was not the King’s son but a substitute who had been smuggled into the birthing chamber in a bed-warming pan.

In reality, it was the officers of the Army and Navy who were beginning to plot and scheme in their taverns and drinking clubs. They were annoyed that James had introduced Papist officers into the military. The Irish Army, for example, had seen much of its Protestant officer corps dismissed and replaced with Catholics who had little to no military experience.

James dissolved Parliament in July 1688. Around this time, a bishop and six prominent politicians wrote to Mary and her Dutch husband, William of Orange (1650 – 1702), inviting them to raise an army, invade London, and seize the throne. They accepted.

William landed at Brixham in Devon on Guy Fawkes’ Day, accompanied by an army of fifteen thousand Dutchmen and other Protestant Europeans. He quickly seized Exeter before marching eastward towards London. James II called for troops to confront William.

Things were not looking good for James, however. Large parts of his officer corps defected to the enemy, taking their soldiers with them. Without the leadership of their officers, many soldiers simply went home. English magnates began declaring for William. Even James’ own daughter, Princess Anne (1665 – 1714), left Whitehall to join the rebels in Yorkshire. James, abandoned by everyone, fled into exile in France. He would die there twelve years later.

On January 22nd, 1689, William called the first ‘convention parliament.’ At this convention, Parliament passed two resolutions. First, it was decided that James’ flight into exile constituted an act of abdication. Second, it was declared against public policy for the throne to be occupied by a Catholic. As such, the throne passed over James Francis Edward Stuart, and William and Mary were invited to take the Crown as co-monarchs.

They would be constrained, however, by the 1689 Bill of Rights and, later, by the 1701 Act of Settlement. The 1689 Bill of Rights made Great Britain a constitutional monarchy as opposed to an absolute one. It established Parliament, not the crown, as the supreme source of law. And it set out the most basic rights of the people.

Likewise, the 1701 Act of Settlement helped to strengthen the Parliamentary system of governance and secured a Protestant line of succession. Not only did it prevent Catholics from assuming the throne, but it also gave Parliament the ability to dictate who could ascend to the throne and who could not.

The Glorious Revolution was one of the most important events in Britain’s political evolution. It made William and Mary, and all monarchs after them, elected monarchs. It established the concept of Parliamentary sovereignty, granting that political body the power to make or unmake any law it chose. The establishment of Parliamentary sovereignty brought with it the ideas of responsible and representative government.

The British philosopher Roger Scruton (1944 – ) described British constitutional monarchy as a “light above politics which shines down [on] the human bustle from a calmer and more exalted sphere.” A constitutional monarchy unites the people of a nation under a monarch who symbolises their shared history, culture, and traditions.

Constitutional monarchy is a compromise between autocracy and democracy. Power is shared between the monarch and the government, both of whom have their powers restricted by a written, or unwritten, constitution. This arrangement separates the theatre of power from the realities of power. The monarch is able to represent the nation whilst the politician is able to represent his constituency (or, more accurately, his party).

In The Need for Roots, the French philosopher Simone Weil (1909 – 1943) wrote that Britain had managed to maintain a “centuries-old tradition of liberty guaranteed by the authorities.” Weil was astounded to find that chief power in the British constitution lay in the hands of a lifelong, unelected monarch. For Weil, it was this arrangement that allowed Britain to retain its tradition of liberty when other countries – Russia, France, and Germany, among others – lost theirs upon abolishing their monarchies.


Great Britain’s greatest legacy is not its once vast and now non-existent Empire, but the ideas of liberty and governance that it has gifted to most of its former colonies. Even the United States, which separated itself from Britain by means of war, inherited most of its ideas about “life, liberty, and the pursuit of happiness” from its English forebears.

The word “Commonwealth” was adopted at the Sixth Imperial Conference, held between October 19th and November 26th, 1926. The Conference, which brought together the Prime Ministers of the various dominions of the British Empire, led to the formation of the Inter-Imperial Relations Committee. The Committee, headed by the former British Prime Minister Arthur Balfour (1848 – 1930), was tasked with looking into future constitutional arrangements within the Commonwealth.

Four years later, at the Seventh Imperial Conference, the committee delivered the Balfour Report. It stated:

“We refer to the group of self-governing communities composed of Great Britain and the Dominions. Their position and mutual relation may be readily defined. They are autonomous Communities within the British Empire, equal in status, in no way subordinate one to another in any aspect of their domestic or external affairs, though united by a common allegiance to the Crown, and freely associated as members of the British Commonwealth of Nations.”

It continued:

“Every self-governing member of the Empire is now the master of its destiny. In fact, if not always in form, it is subject to no compulsion whatsoever.”

Then, in 1931, the Parliament of the United Kingdom passed the Statute of Westminster. It became one of two laws that would secure Australia’s political and legal independence from Great Britain.

The Statute of Westminster gave legal recognition to the de facto independence of the British dominions. Under the law, Australia, Canada, the Irish Free State, Newfoundland (which would relinquish its dominion status and be absorbed into Canada in 1949), New Zealand, and South Africa were granted legal independence.

Furthermore, the law ended the application of the Colonial Laws Validity Act 1865 to the dominions, a law which had been enacted with the intention of removing “doubts as to the validity of colonial laws.” According to that Act, a colonial law was void when it “is or shall be in any respect repugnant to the provisions of any Act of Parliament extending to the colony to which such laws may relate, or repugnant to any order or regulation under authority of such act of Parliament or having in the colony the force and effect of such act, shall be read subject to such act, or regulation, and shall, to the extent of such repugnancy, but not otherwise, be and remain absolutely void and inoperative.”

The Statute of Westminster was quickly adopted by Canada, South Africa, and the Irish Free State. Australia, on the other hand, did not adopt it until 1942, and New Zealand did not adopt it until 1947.

More than forty years later, the Hawke Labor government passed the Australia Act 1986. This law effectively made the Australian legal system independent of Great Britain. It had three major achievements. First, it ended appeals to the Privy Council, thereby establishing the High Court as the highest court in the land. Second, it ended the influence the British government had over the states of Australia. And third, it allowed Australia to update or repeal those imperial laws that applied to it by ending British legislative restrictions.

What the law did not do, however, was withdraw the Queen’s status as Australia’s Head of State:

“(1) Her Majesty’s representative in each State shall be the Governor.

(2) Subject to subsections (3) and (4) below, all powers and functions of Her Majesty in respect of a State are exercisable only by the Governor of the State.

(3) Subsection (2) above does not apply in relation to the power to appoint, and the power to terminate the appointment of, the Governor of a State.

(4) While Her Majesty is personally present in a State, Her Majesty is not precluded from exercising any of Her powers and functions in respect of the State that are the subject of subsection (2) above.

(5) The advice to Her Majesty in relation to the exercise of the powers and functions of Her Majesty in respect of a State shall be tendered by the Premier of the State.”

These two laws dispel an important misconception that is often exploited by Australian republicans: the myth that Australia lacks legal and political independence because its Head of State is the British monarch. The passage of the Statute of Westminster in 1931 and the Australia Act in 1986 effectively ended any real political or legal power the British government had over Australia.

In Australia, the monarch (who is our Head of State by law) is represented by a Governor-General. This individual – and the office has been held by an Australian since 1965 – is required to take an oath of allegiance and an oath of office administered by a Justice (typically the Chief Justice) of the High Court. The Governor-General holds his or her position at the Crown’s pleasure, with appointments typically lasting five years.

The monarch issues letters patent appointing the Governor-General on the advice of Australian ministers. Prior to 1924, Governors-General were appointed on the advice of both the British government and the Australian government, because the Governor-General at that time represented both the monarch and the British government. This arrangement changed, however, at the Imperial Conferences of 1926 and 1930. The Balfour Report produced by these conferences stated that the Governor-General should be the representative of the Crown alone.

The Governor-General’s role is almost entirely ceremonial. It has been argued that such an arrangement could work with an elected Head of State. However, electing the Head of State would have the effect of politicising, and thereby corrupting, the office. A Presidential candidate in the United States, for example, is required to raise millions of dollars for his campaign and often finds himself beholden to the donors who made his ascent possible. The beauty of having an unelected Head of State, aside from the fact that it prevents the government from assuming total power, is that he or she can avoid the snares that trap other political actors.


The 1975 Constitutional Crisis is a perfect example of the importance of having an independent and impartial Head of State. The crisis stemmed from the Loans Affair, which forced Dr. Jim Cairns (1914 – 2003) – Deputy Prime Minister, Treasurer, and intellectual leader of the political left – and Rex Connor (1907 – 1977) out of the cabinet. As a consequence of the constitutional crisis, Gough Whitlam (1916 – 2014) was dismissed as Prime Minister and the 29th federal parliament was dissolved.

The Loans Affair began when Rex Connor attempted to borrow up to US$4 billion to fund a series of proposed national development projects. Connor deliberately flouted the rules of the Australian Constitution, which required him to take such non-temporary government borrowing to the Loan Council (a ministerial council consisting of both Commonwealth and state representatives which existed to coordinate public sector borrowing) for approval. Instead, on December 13th, 1974, Gough Whitlam, Attorney-General Lionel Murphy (1922 – 1986), and Dr. Jim Cairns authorised Connor to seek a loan without the Council’s approval.

When news of the Loans Affair leaked, the Liberal Party, led by Malcolm Fraser (1930 – 2015), began questioning the government. Whitlam attempted to brush the scandal aside by claiming that the loans had merely been “matters of energy” and that the Loan Council would only be advised once a loan had been made. Then, on May 21st, Whitlam informed Fraser that the authority for the plan had been revoked.

Despite this, Connor continued to liaise with the Pakistani financial broker Tirath Khemlani (1920 – 1991). Khemlani was tracked down and interviewed by the Herald journalist Peter Game (1927 – ) in mid-to-late 1975. Khemlani claimed that Connor had asked for a twenty-year loan with an interest rate of 7.7% and a 2.5% commission for Khemlani. The claim threw serious doubt on Dr. Jim Cairns’ claim that the government had not offered Khemlani a commission on a loan. Game also revealed that Connor and Khemlani were still in contact, something Connor denied in the Sydney Morning Herald.

Unfortunately, Khemlani stalled on the loan, most notably when he was asked to go to Zurich with Australian Reserve Bank officials to prove the funds were in the Union Bank of Switzerland. When it became apparent that Khemlani would never deliver, Whitlam was forced to secure the loan through a major American investment bank. As a condition of that loan, the Australian government was required to cease all other loan activities. Consequently, Connor had his loan-raising authority revoked on May 20th, 1975.

The combination of existing economic difficulties and the political impact of the Loans Affair severely damaged the Whitlam government. At a special one-day sitting of Parliament held on July 9th, Whitlam attempted to defend the actions of his government and tabled evidence concerning the loan. It was an exercise in futility, however. Malcolm Fraser authorised Liberal Party senators – who held the majority in the upper house at the time – to force a general election by blocking supply.

And things were only about to get worse. In October 1975, Khemlani flew to Australia and provided Peter Game with telexes and statutory declarations Connor had sent him, proving that he and Connor had been in frequent contact between December 1974 and May 1975. When a copy of this incriminating evidence found its way to Whitlam, the Prime Minister had no choice but to dismiss Connor and Cairns (though he did briefly make Cairns Minister for the Environment).

By mid-October, every metropolitan newspaper in Australia was calling on the government to resign. Encouraged by this support, the Liberals in the Senate deferred the Whitlam budget on October 16th. Whitlam warned Fraser that the Liberal Party would be “responsible for bills not being paid, for salaries not being paid, for utter financial chaos.” Whitlam was alluding to the fact that blocking supply threatened essential services, Medibank rebates, the budgets of government departments, and the salaries of public servants. Fraser responded by accusing Whitlam of bringing his own government to ruin by engaging in “massive illegalities.”

On October 21st, Australia’s longest-serving Prime Minister, Sir Robert Menzies (1894 – 1978), signalled his support for Fraser and the Liberals. The next day, Treasurer Bill Hayden (1933 – ) reintroduced the budget bills and warned that further delay would increase unemployment and deepen a recession that had blighted the Western world since 1973.

The crisis would come to a head on Remembrance Day 1975. For weeks, Whitlam had asserted that the Senate could not force him into an election, claiming that the House of Representatives had an independence and an authority separate from the Senate.

Whitlam had decided that he would end the stalemate by seeking a half-senate election. Little did he know, however, that the Governor-General, Sir John Kerr (1914 – 1991) had been seeking legal advice from the Chief Justice of the High Court on how he could use his Constitutional Powers to end the deadlock. Kerr had come to the conclusion that should Whitlam refuse to call a general election, he would have no other alternative but to dismiss him.

And this is precisely what happened. With the necessary documents drafted, Whitlam arranged to meet Kerr during the lunch recess. When Whitlam refused to call a general election, Kerr dismissed him and, shortly after, swore in Malcolm Fraser as caretaker Prime Minister. Fraser assured Kerr that he would immediately pass the supply bills and dissolve both houses in preparation for a general election.

Whitlam returned to the Lodge to eat lunch and plan his next move. He informed his advisors that he had been dismissed. It was decided that Whitlam’s best option was to assert Labor’s legitimacy as the largest party in the House of Representatives. However, fate was already moving against Whitlam. The Senate had already passed the supply bills, and Fraser was drafting the documents that would dissolve the Parliament.

At 2pm, Deputy Prime Minister Frank Crean (1916 – 2008) defended the government against a censure motion moved by the opposition. “What would happen, for argument’s sake, if someone else were to come here today and say he was now the Prime Minister of this country?”, Crean asked. In fact, Crean was stalling for time while Whitlam prepared his response.

At 3pm, Whitlam made a last-ditch effort to save his government by addressing the House. Removing references to the Queen, he moved that the “House expresses its want of confidence in the Prime Minister and requests Mr. Speaker forthwith to advise His Excellency the Governor-General to call the member for Wannon to form a government.” Whitlam’s motion was passed with a majority of ten.

The Speaker, Gordon Scholes (1931 – 2018), expressed his intention to “convey the message of the House to His Excellency at the first opportunity.” It was a race that Whitlam could not win. Scholes was unable to arrange an appointment until quarter to five in the afternoon.

Behind the scenes, departmental officials were working to provide Fraser with the paperwork he needed to proclaim a double dissolution. At ten to four, Fraser left for Government House. Ten minutes later, Sir John Kerr signed the proclamation dissolving both Houses of Parliament and set the date of the upcoming election for December 13th, 1975. Shortly after, Kerr’s official secretary, David Smith (1933 – ), drove to Parliament House and, with Whitlam looming behind him, read the Governor-General’s proclamation.

The combination of economic strife, political scandal, and the dismissal sealed the fate of Whitlam’s government. At the 1975 federal election, the Liberal–National coalition won by a landslide, winning ninety-one seats and a popular vote of 4,102,078. In the final analysis, it seems that the Australian people agreed with Kerr’s decision and voted to remove Whitlam’s failed government from power once and for all.


Most of the arguments levelled against constitutional monarchy can be described as petty, childish, and ignorant. The biggest mistake made by those who oppose constitutional monarchy is the failure to separate the royal family (who are certainly not above reproach) from the institution of monarchy itself. Dislike for the Windsor family is not a sufficient reason to oppose constitutional monarchy. It would be as if I argued for the abolition of the office of Prime Minister just because I didn’t like the person who held that office.

One accusation frequently levelled against the monarchy is that it is an undue financial burden on the British taxpaying public. This is a hollow argument, however. It is certainly true that the monarchy costs the British taxpayer £299.4 million every year. And it is certainly true that the German Presidency costs only £26 million every year. However, it is not true that all monarchies are necessarily more expensive than presidencies. The Spanish monarchy costs only £8 million per year, less than the presidencies of Germany, Finland, and Portugal.

Australia has always had a small but vocal republican movement. The National Director of the Australian Republican Movement, Michael Cooney, has stated: “no one thinks it ain’t broken, that we should fix it. And no one thinks we have enough say over our future, and so, no matter what people think about in the sense of the immediate of the republic everyone knows that something is not quite working.”

History, however, suggests that the Australian people do not necessarily agree with Cooney’s assessment. The Republican referendum of 1999 was designed to facilitate two constitutional changes: first, the establishment of a republic, and, second, the insertion of a preamble in the Constitution.

The referendum was held on November 6th, 1999. Around 99.14% of the Australian voting public, or 11,683,811 people, participated. 45.13%, or 5,273,024, voted yes; 54.87%, or 6,410,787, voted no. The Australian people had decided to maintain Australia’s constitutional monarchy.

All things considered, it was probably a wise decision. The chaos caused by establishing a republic would pose a greater threat to our liberties than a relatively powerless old lady. Several problems would need to be addressed. How often should elections occur? How would these elections be held? What powers should a President have? Will a President be just the head of state, or will he be the head of the government as well? Australian republicans appear unwilling to answer these questions.

Margit Tavits of Washington University in St. Louis once observed that “monarchs can truly be above politics. They usually have no party connections and have not been involved in daily politics before assuming the post of Head of State.” It is the job of the monarch to become the human embodiment of the nation. It is the monarch who becomes the centrepiece of pageantry and spectacle. And it is the monarch who symbolises a nation’s history, tradition, and values.

Countries with elected, or even unelected, Presidents can be quite monarchical in style. Americans, for example, often regard their President (who is both the Head of State and the head of the government) with an almost monarchical reverence. A constitutional monarch might be a lifelong, unelected Head of State, but unlike a President, that is generally where their power ends. It is rather ironic that, as the Oxford political scientists Petra Schleiter and Edward Morgan-Jones have noted, elected presidents are more willing to allow governments to change without democratic input, like elections, than monarchs are. Furthermore, by occupying his or her position as Head of State, the monarch is able to prevent other, less desirable people from doing so.

The second great advantage of constitutional monarchies is that they provide their nation with stability and continuity. A monarchy is an effective means of bridging the past and the future. A successful monarchy must evolve with the times whilst simultaneously keeping itself rooted in tradition. All three of my surviving grandparents have lived through the reigns of King George VI and Queen Elizabeth II, and may possibly live to see the coronation of King Charles III. I know that I will live through the reigns of Charles and King William V, and may possibly survive to see the coronation of King George VII (though he will certainly outlive me).

It would be easy to dismiss stability and continuity as manifestations of mere sentimentality, but such things also have a positive effect on the economy. In a study entitled Symbolic Unity, Dynastic Continuity, and Countervailing Power: Monarchies, Republics and the Economy, Mauro F. Guillén found that monarchies had a positive impact on economies and living standards over the long term. The study, which examined data from one hundred and thirty-seven countries, including different kinds of republics and dictatorships, found that individuals and businesses felt more confident that the government would not interfere with their property in constitutional monarchies than in republics. As a consequence, they were more willing to invest in their respective economies.

When Wordsworth wrote his ode to Milton, he was mourning the loss of the chivalry he felt had once pervaded English society. Today, the West is once again in serious danger of losing the two things that give it a connection to the chivalry of the past: a belief in God and a submission to a higher authority.

Western culture is balanced between an adherence to reason and freedom on the one hand and a submission to God and authority on the other. It has been this delicate balance that has allowed the West to become what it is. Without it, we become like Shakespeare’s Hamlet: doomed to a life of moral and philosophical uncertainty.

It is here that the special relationship between freedom and authority that constitutional monarchy implies becomes so important. It satisfies the desire for personal autonomy and the need for submission simultaneously.

The Christian apologist and novelist C.S. Lewis (1898 – 1964) once argued that most people do not deserve a share in governing a hen-roost, much less a nation:

“I am a democrat because I believe in the fall of man. I think most people are democrats for the opposite reason. A great deal of democratic enthusiasm descends from the idea of people like Rousseau who believed in democracy because they thought mankind so wise and good that everyone deserved a share in the government. The danger of defending democracy on those grounds is that they’re not true and whenever their weakness is exposed the people who prefer tyranny make capital out of the exposure.”

The necessity for limited government, much like the necessity for authority, comes from our fallen nature. Democracy did not arise because people are so naturally good (which they are not) that they ought to be given unchecked power over their fellows. Aristotle (384 BC – 322 BC) may have been right when he stated that some people are only fit to be slaves, but unlimited power is wrong because no person is perfect enough to be a master.

Legal and economic equality are necessary bulwarks against corruption and cruelty. (Economic equality, of course, refers to the freedom to engage in lawful economic activity, not to socialist policies of redistributing wealth that inevitably lead to tyranny). Legal and economic equality, however, does not provide spiritual sustenance. The ability to vote, buy a mobile phone, or work a job without being discriminated against may increase the joy in your life, but it is not a pathway to genuine meaning in life.

Equality serves the same purpose that clothing does. We are required to wear clothing because we are no longer innocent. The necessity of clothes, however, does not mean that we do not sometimes desire the naked body. Likewise, just because we adhere to the idea that God made all people equal does not mean that there is not a part of us that wishes for inequality to present itself in certain situations.

Chivalry symbolises the best human beings can be. It helps us realise the best in ourselves by reconciling fealty and command, inferiority and superiority. However, the ideal of chivalry is a paradox. When the veil of innocence has been lifted from our eyes, we are forced to reconcile ourselves to the fact that bullies are not always cowards and heroes are not always modest. Chivalry, then, is not a natural state, but an ideal to be aimed for.

The chivalric ideal marries the virtues of humility and meekness with those of valour, bravery, and firmness. “Thou wert the meekest man who ever ate in hall among ladies”, said Sir Ector to the dead Lancelot. “And thou wert the sternest knight to thy mortal foe that ever put spear in the rest.”

Constitutional monarchy, like chivalry, makes a two-fold demand on the human spirit. Its democratic element, which upholds liberty, demands civil participation from all its citizens. And its monarchical element, which champions tradition and authority, demands that the individual subjugate himself to that tradition.

It has been my aim in this essay to provide a historical, practical, and spiritual justification for constitutional monarchy. I have demonstrated that the British have developed ideals of liberty, justice, and good governance. The two revolutions of the 17th century – the English Civil War and the Glorious Revolution – established Great Britain as a constitutional monarchy: they meant that the monarch could not rule without the consent of Parliament, established Parliament as the supreme source of law, and allowed it to determine the line of succession. I have demonstrated that constitutional monarchs are more likely to uphold democratic principles and that the stability they produce encourages robust economies. And I have demonstrated that monarchies enrich our souls because they awaken in us the need for both freedom and obedience.

Our world has become so very vulgar. We have turned our backs on God, truth, beauty, and virtue. Perhaps we, like Wordsworth before us, should seek virtue, manners, freedom, and power. We can begin to do this by retaining the monarchy.

Transgenderism Is No Basis for Public Policy


It has been over fourteen years since David Reimer, the victim of an insane and evil scientific experiment, committed suicide. After his penis was burnt off in a botched circumcision, David’s parents turned to the infamous sexologist and social constructionist Dr. John Money for help. Following Dr. Money’s advice, David’s parents agreed to allow a sex-change operation to be performed on their young son and raised him as a girl.

Despite Dr. Money’s boasts that his experiment had been a success, David Reimer did not settle comfortably into his female identity. David tore up his dresses at three, asked if he could have his head shaved like his father, and engaged in all manner of boyish behaviour. He was bullied at school and, upon hitting puberty, decided that he was a homosexual (in reality, of course, he was heterosexual).

Finally, when he was fourteen, David’s parents revealed the truth about his gender identity. David reverted to his masculine identity, broke off contact with Dr. Money, whom he described as an abusive brainwasher, and received a non-functioning penis through phalloplasty. Unable to handle the immense psychological damage that had been inflicted upon him, David Reimer blew his brains out with a shotgun at the age of thirty-eight.

For all of human history, boy has meant boy and girl has meant girl. Traditionally, sex was used to refer to the biological markers of gender. If you were born with a penis and an XY chromosome, you were a man. If you were born with a vagina and an XX chromosome, you were a woman. One’s gender expression was thought to complement one’s biological sex. A biological man would have masculine personality traits and a biological female would have feminine personality traits. These complementary characteristics, among them body shape, dress, mannerisms, and personality, were thought to be produced by a mixture of natural and environmental forces.

Recently, however, gender theorists have begun to question the relationship between biological sex and gender identity. They argue that gender, which they see as distinct from sex, is a social construct. Since gender refers to the expression of masculinity and femininity, gender is something that a person acquires. (Needless to say, this movement is driven by a pernicious postmodern, neo-Marxist worldview). Under this philosophy, gender expression is the manner in which a person expresses their gender identity, and that identity is expressed through dress, behaviour, and speech, and nothing else besides.

Neuroplasticity provides the gender theorist with perhaps his greatest argument. If underlying brain processes are theoretically strengthened through repetitive use, it follows that gender identity comes from a narrowing down of potential gender categories through the repetitive use of certain brain processes. However, it also reveals a fatal flaw in the gender theorist’s (and social constructionist’s) philosophy. If the human brain is so malleable that an individual’s gender identity is constructed, then why can’t the brain of a transgender person be adapted out of its transgenderism?

The primary problem with gender theory is that it is just plain wrong. The idea that gender is distinct from sex has absolutely no basis in science whatsoever. As Jordan Peterson, the Canadian psychologist, has stated: “the idea that gender identity is independent of biological sex is insane. It’s wrong. The scientific data is clear beyond dispute. It’s as bad as claiming that the world is flat.” Men and women differ at both the cellular and the temperamental level. Unlike men, for example, women menstruate, can have babies, and show a slew of personality characteristics that mark them as different from men. David C. Page, the Director of the Whitehead Institute at the Massachusetts Institute of Technology, has even claimed that genetic differences exist at the cellular level, asserting that “throughout human bodies, the cells of males and females are biochemically different.” These differences even affect how men and women contract and fight diseases.

The philosopher Alain de Benoist has also strongly criticised gender theory, arguing against its scientific errors and philosophical absurdities in his work Non à la théorie du genre (No to Gender Theory).

First, De Benoist pointed out that gender theorists have used the fact that some gender characteristics are socially constructed to argue that all gender characteristics are socially constructed.

Second, De Benoist argued that what he calls the “hormonal impregnation of the foetus” causes the brain to become genderised because it has a “direct effect on the organisation of neural circuits, creating a masculine brain and a feminine brain, which can be distinguished by a variety of anatomical, physiological, and biochemical markers.”

Third, De Benoist argued that biological sex has a profound effect on the way people think, act, and feel. In order to support their theory, gender theorists are forced to deny the natural differences between men and women. De Benoist wrote:

“From the first days of life, boys look primarily at mechanized objects or objects in movement while girls most often search for visual contact with human faces. Only a few hours after birth, a girl responds to the cries of other infants while a boy shows no interest. The tendency to show empathy is stronger in girls than in boys long before any external influence (or “social expectations”) have been able to assert themselves. At all ages and stages of development, girls are more sensitive to their emotional states and to those of others than boys … From a young age, boys resort to physical strategies where girls turn to verbal ones … From the age of two, boys are more aggressive and take more risks than girls.”

Furthermore, gender theory cheapens what it means to be a man or a woman. And, by extension, it denigrates the contributions that each gender makes to civil society. Gender values give people ideals to strive for and help them determine the rules that govern human interactions. The idea that men and women ought to be treated the same is ludicrous beyond belief. No parent would like to see their son treat a woman the same way he treats his male friends. Men have been taught to be gentlemen and women have been taught to be ladies for a reason.

All of this is not to say, however, that those pushing transgender rights do not have a case. They are right when they claim that the transgender peoples of the world face discrimination, prejudice, and violence. Some countries treat transgenderism as a crime, and it is certainly true that transgender people are more likely to be victims of violence, including murder. A reasonable transgender rights argument would be that transgender people cannot help their affliction and that society ought to treat them with kindness, tolerance, and compassion.

Unfortunately, that is not the argument that gender activists like to make. Rather than focusing on promoting tolerance, gender activists have instead sought to do away with gender distinctions altogether (which is, more likely than not, their actual aim). Using a very tiny minority of the population as their moral basis, the gender activists are attempting to force society to sacrifice its traditional classifications of male and female.

Transgenderism is clearly a mental health disorder. In the past, it was referred to as “gender dysphoria”, considered a mental illness, and treated as such. To assert that transgenderism is a mental health disorder is not to deny an individual’s integral worth as a human being. It is merely to acknowledge the existence of an objective reality in which gender is both binary and distinct. Unfortunately, this is not the attitude of those who influence public opinion. Consequently, programs for LGBTQ youth have seen an increase in youth who identify as transgender. The transgender journalist Libby Down Under has blamed instances of rapid-onset gender dysphoria on the normalisation of transgenderism in the culture. With a slew of celebrities coming out as transgender (former Olympian Bruce Jenner being a primary example), and with transgender characters being featured on numerous television shows, many teens and tweens have suddenly decided that they are transgender despite having no prior history of gender confusion.

Transgender youth increasingly feel that it is their right to express themselves however they please. And they feel that it is their right to silence all who dare to criticise or disagree with that expression. Cross-living, hormone therapy, and sex reassignment surgery are seen as part of this self-expression. Alarmingly, the mainstream response of psychotherapists to these children and adolescents is the “immediate affirmation of [their] self-diagnosis, which often leads to support for social and even medical transition.”

It is a classic case of political posturing overshadowing the pursuit of truth. Most youth suffering from gender dysphoria grow out of it. Dr. James Cantor of the University of Toronto has cited three large-scale studies, along with other smaller studies, to show that transgender children eventually grow out of their gender dysphoria. The Diagnostic and Statistical Manual of Mental Disorders (5th edition) states that desistance rates for gender dysphoria are seventy to ninety percent in “natal males” and fifty to eighty-eight percent in “natal females.” Similarly, the American Psychological Association’s Handbook of Sexuality and Psychology concludes that the vast majority of gender dysphoria-afflicted children learn to accept their gender by the time they have reached adolescence or adulthood.

It is not a secret that transgenderism lends itself to other mental health problems. Forty-one percent of transgender people have either self-harmed or experienced suicidal ideation (this percentage, of course, does not reveal at what stage of transition suicidal ideation or attempts occur). The postmodern, neo-Marxist answer to this problem is that transgender people are an oppressed minority and that they are driven to mental illness as a result of transphobia, social exclusion, bullying, and discrimination.

It is typical of the left to presume that society is to blame for an individual’s suffering. And to a certain extent, they are right. Transgender people are the victims of discrimination, prejudice, and violence. But it is more than likely that these abuses exacerbate their problems rather than cause them. One in eight transgender people, for example, rely on sex work and drug work to survive. Is that the fault of society or the fault of the individual? The National Center for Transgender Equality claims that it is common for transgender people to have their privacy violated, to experience harassment and physical and sexual violence, and to face discrimination when it comes to employment. They claim that a quarter of all transgender people have lost their jobs and three-quarters have faced workplace discrimination because of their transgender status.

In Australia, there has been a move to allow transgender children access to hormone-blocking drugs and sex-change surgeries. Australian gender activists – surprise, surprise – support the idea as a way to reduce the rates of suicide among transgender people. The Medical Journal of Australia has approved the use of hormone therapy on thirteen-year-olds despite the fact that the scientific community remains, as of 2018, undecided on whether or not puberty-blocking drugs are either safe or reversible.

In the United States, a great deal of debate has occurred over transgender rights. In particular, there have been debates over which bathrooms they should be allowed to use, how they should be recognised on official documents, and whether they should be allowed to serve in the military. In 2016, then-President Barack Obama ordered public schools to allow transgender students to use whatever bathroom they desire. Similar ordinances have been passed in hundreds of cities and counties across the United States. Seventeen states and the District of Columbia have passed ‘non-discrimination’ laws covering gender identity and gender expression; these laws extend to restrooms, locker rooms, and change rooms.

In March of 2016, North Carolina passed a law which required people in government buildings to use the bathroom appropriate to their biological gender. The US Federal Government decried the decision as bigotry and accused the government of North Carolina of violating the Civil Rights Act. The Federal Government threatened to withhold over US$4 billion in education funding. The government of North Carolina responded by filing suit against the government of the United States. The US government responded by filing suit against North Carolina. North Carolina received support from Mississippi, Tennessee, and Texas whilst Washington received support from most of the northern states.

Pro-transgender bathroom policies are not limited to government, however. Many businesses in the United States have similar bathroom policies. Many large corporations, among them Target, allow transgender people to use the bathroom of their choice. And they are perfectly prepared to enforce these policies, as well. A Macy’s employee in Texas was fired after he refused to allow a man dressed as a woman to use the female change rooms. Similarly, Planet Fitness revoked the membership of a woman who complained that a transgender man was in the female change rooms.

The most alarming trend of the gender theory movement is the attempt to indoctrinate children through changes to the education system. In 2013, France unleashed the ABCD de l’égalité (the ABCs of Equality) on six hundred elementary schools. In their own words, the program was designed to teach students that gender was a social construct:

“Gender is a sociological concept that is based on the fact that relations between men and women are socially and culturally constructed. The theory of gender holds that there is a socially constructed sex based on differentiated social roles and stereotypes in addition to anatomical, biological sex, which is innate.”

The creators of the program are smart enough to include the disclaimer: “biological differences should not be denied, of course, but those differences should not be fate.”

Fortunately, it would seem that many people are not taken in by this race to fantasyland. They have seen through the claim that the program merely exists to combat gender stereotypes and teach respect, and have protested. The French Minister of Education dismissed the protestors, saying that they “have allowed themselves to be fooled by a completely false rumour… at school we are teaching little boys to become little girls. That is absolutely false, and it needs to stop.” In America, The Boston Globe dismissed the protests against the program as being motivated by fear. Judith Butler even went so far as to say that France’s financial instability was the true cause of the protests.

And such a profound misuse of the education system isn’t limited to France, either. In Scotland, teachers are given guidance by LGBT Youth Scotland, children are expected to demonstrate “understanding of diversity in sexuality and gender identity”, and children are allowed to identify as either a girl or boy, or neither. The government of the United Kingdom has mandated that transgender issues be taught as part of the sex and relationships curriculum in primary and secondary school. Justine Greening, the education secretary, said: “it is unacceptable that relationships and sex education guidance has not been updated for almost twenty years especially given the online risks, such as sexting and cyberbullying, our children and young people face.”

It is in Australia, however, that there is the most shocking case of gender theory indoctrination. A great deal of controversy has been generated over the Safe Schools program. The program, which was established by the Victorian government in 2010, is supposedly designed to provide a safe, supportive, and inclusive environment for LGBTI students. It states that schools have the responsibility to challenge “all forms of homophobia, biphobia, transphobia, intersexism to prevent discrimination and bullying.”

The Safe Schools program promotes itself as an anti-bullying resource supporting “sexual diversity, intersex and gender diversity in schools.” It requires Victorian schools to eliminate discrimination based on gender identity, intersex status, and sexual orientation, and to foster an inclusive school environment.

The program addresses the issues of sleeping and bathroom arrangements and dress code. In terms of dress code, the program states:

“An inflexible dress code policy that requires a person to wear a uniform (or assume characteristics) of the sex that they do not identify with is likely to be in breach of anti-discrimination legislation including under the Equal Opportunity Act (1984) SA”

Likewise, the program states on the issue of bathrooms and change rooms that “transgender and diverse students should have the choice of accessing a toilet/changeroom that matches their gender identity.” In addition, the program states:

“Schools may also have unisex/gender neutral facilities. While this is a helpful strategy for creating an inclusive school environment for gender diverse students broadly, it is not appropriate to insist that any student, including a transgender student, use this toilet if they are not comfortable doing so.”

The idea that a transgender boy or girl should be allowed to sleep, shower, and defecate in the same place as a group of boys or girls ought to ring alarm bells for everyone. It increases the risk of sexual activity, sexual assault, pregnancy, and the transmission of sexually transmitted diseases. There is a reason why schools segregate changerooms, toilets, and dormitories.

The tragedy of David Reimer reveals just how dangerous it is to ignore the truth in favour of a false and malevolent social philosophy. It is one thing to seek tolerance and compassion for those in the community who may be struggling with their identity. It is something else entirely to use the plight of transgender peoples as a means of coercing society into changing the way it categorises gender. And it is completely insane to allow a false philosophy like gender theory to be used as the basis of public policy. If we don’t want more tragedies like David Reimer’s, we should put gender theory out in the trash where it belongs.

DEMAND-SIDE ECONOMICS VERSUS SUPPLY-SIDE ECONOMICS


On May 9th, 2018, the YouTube channel Juice Media uploaded a video entitled “Honest Government Ad: Trickle Down Economics.” In the video, the rather obnoxious and condescending female presenter tells the audience that the reason Australia has “one of the fastest growing inequality rates in the world” is trickle-down economics, which she defines as “when we [the government] piss on you and tell you it’s raining.”

According to the video, tax cuts for investors, entrepreneurs, and business are directly correlated with poverty and the lack of wage growth in Australia. The presenter argues that the government cuts taxes on the rich while simultaneously claiming that they don’t have enough money for healthcare (which would be a lot more effective if people took responsibility for their own health), renewable energy (which is really an excuse to take control of the energy market), and the ABC (which doesn’t deserve a cent of anyone’s money).

The primary problem with the video is that the premise of its argument does not actually exist. There is not a single economic theory that can be identified as trickle-down economics (also known as trickle-down theory). No reputable economist has ever used the term, nor have they ever presented an argument that could be said to conform to the idea of what it is supposed to be. As Thomas Sowell (1930 – ) wrote in his book, Basic Economics:

“There have been many economic theories over the centuries, accompanied by controversies among different schools and economists, but one of the most politically prominent economic theories today is one that has never existed among economists: the trickle-down theory. People who are politically committed to policies of redistributing income and who tend to emphasise the conflicts between business and labour rather than their mutual interdependence often accuse those opposed to them of believing that benefits must be given to the wealthy in general, or to business in particular, so that these benefits will eventually trickle down to the masses of ordinary people. But no recognised economist of any school of thought has ever had any such theory or made any such proposal.”

The key to understanding why political players disparage pro-capitalist and pro-free market economic policies as trickle-down economics is understanding how economics is used to deceive and manipulate. Political players understand that simple and emotionally-charged arguments tend to be more effective because very few people understand actual economics. Anti-capitalists and anti-free marketeers therefore use the term trickle-down economics to disparage economic policies that disproportionately benefit the wealthy in the short term but increase the standard of living for everyone in the long term.

The economic theory championed by liberals (read: leftists) is demand-side economics. Classical economics rejected demand-side economic theory for two reasons. First, manipulating demand is futile because demand is the result of production, not its cause. Second, it is (supposedly) impossible to over-produce something. The French economist Jean-Baptiste Say (1767 – 1832) demonstrated the irrelevance of demand-side economics by pointing out that demand is derived from the supply of goods and services to the market. As a consequence of the works of Jean-Baptiste Say, the British economist David Ricardo (1772 – 1823), and other classical economists, demand-side economic theory lay dormant for more than a century.

One classical economist, however, was prepared to challenge this view. The English economist Thomas Robert Malthus (1766 – 1834) argued that the recession Great Britain experienced in the aftermath of the Napoleonic Wars (1803 – 1815) was caused by a failure of demand. In other words, purchasing power fell below the supply of goods and services in the market. Malthus wrote:

“A nation must certainly have the power of purchasing all that it produces, but I can easily conceive it not to have the will… You have never I think taken sufficiently into consideration the wants and tastes of mankind. It is not merely the proportion of commodities to each other but their proportion to the wants and tastes of mankind that determines prices.”

Using this as his basis, Malthus argued that goods and services on the market could outstrip demand if consumers choose not to spend their money. Malthus believed that while production could increase demand, it was powerless to create the will to consume among individuals.

Demand-side economics works on the theory that economic growth can be stimulated by increasing the demand for goods and services. The American economist J.D. Foster, the Norman B. Ture Fellow in the Economics of Fiscal Policy at the Heritage Foundation, argued that demand-side theory holds that the economy underperforms when total demand is low and, as a consequence, the supply needed to meet that demand is likewise low.

The American economist, Paul Krugman (1953 – ), and other economists believe that recessions and depressions are the results of a decrease in demand and that the most effective method of revivifying the economy is to stimulate that demand. The way to do this is to engage in large-scale infrastructure projects such as the building of bridges, railways, and highways. These projects create a greater demand for things like steel, asphalt, and so forth. And, furthermore, it provides people with a wage which they can spend on things like food, housing, clothing, entertainment, so on and so forth.

Policies based on demand-side economics aim to change the aggregate demand in the economy. Aggregate demand is the sum of consumer spending, investment, government spending, and net exports (exports minus imports). Demand-side policies are either expansive or contractive. Expansive demand-side policies aim at stimulating spending during a recession. By contrast, contractive demand-side policies aim at reducing expenditure during an inflationary period.
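For readers who want the accounting spelled out, here is a minimal sketch in Python of the aggregate demand identity described above. The figures and the stimulus scenario are invented purely for illustration; they do not describe any real economy.

```python
# Aggregate demand identity: AD = C + I + G + (X - M).
# All figures below are hypothetical, in billions of dollars.

def aggregate_demand(consumption, investment, government, exports, imports):
    """Sum the components of aggregate demand."""
    return consumption + investment + government + (exports - imports)

baseline = aggregate_demand(650, 180, 250, 120, 140)

# An expansive fiscal policy raises government spending by 40 billion;
# a contractive one would cut spending or raise taxes instead.
stimulus = aggregate_demand(650, 180, 290, 120, 140)

print(baseline, stimulus)  # 1060 1100
```

On this arithmetic, every extra dollar of government spending raises measured aggregate demand dollar for dollar; whether it raises real output is precisely what demand-side and supply-side economists dispute.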

Demand-side policy can be split into fiscal policy and monetary policy. The purpose of fiscal policy in this regard is to increase aggregate demand. Demand-side fiscal policy can help close a deflationary gap, but it is often not sustainable over the long term and can have the effect of increasing the national debt. When such policies aim at cutting spending and increasing taxes, they tend to be politically unpopular. But when such policies involve lowering taxes and increasing spending, they tend to be politically popular and therefore easy to execute (of course, politicians never bother to explain where they plan to get the money from).

In terms of monetary policy, expansive demand-side policy aims at increasing aggregate demand while contractive policy aims at decreasing it. Expansive monetary policies are also less reliable because their effects are less predictable than those of contractive policies.

Needless to say, demand-side economics has plenty of critics. According to D.W. McKenzie of the Mises Institute, demand-side economics works on the idea that “there are times when total spending in the economy will not be enough to provide employment to all who want to and should be working.” McKenzie argued that the notion that the economy as a whole sometimes lacks sufficient demand “derives from a faulty set of economic doctrines that focus on the demand side of the aggregate economy.” Likewise, Thomas Sowell argued in Supply-Side Politics that there is too much emphasis placed on demand-side economics to the detriment of supply-side economics. He wrote in an article for Forbes:

“If Keynesian economics stressed the supposed benefit of having government manipulate aggregate demand, supply-side economics stressed what the marketplace could accomplish, once it was freed from government control and taxes.”


John Maynard Keynes

The man who greatly popularised demand-side economics was the British economist John Maynard Keynes (1883 – 1946). Keynes, along with many other economists, tested the arguments of the classical economists against the realities of the Great Depression, and the comparison led many to question those arguments. Classical economics, they noted, failed to explain how financial disasters like the Great Depression could happen.

Keynesian economics challenged the views of the classical economists. In his 1936 book, The General Theory of Employment, Interest and Money (one of the foundational texts of modern macroeconomics), Keynes revivified demand-side economics. According to Keynes, output is determined by the level of aggregate demand. Keynes argued that resources are not scarce in many cases, but that they are underutilised due to a lack of demand. Therefore, an increase in production requires an increase in demand. Keynes concluded that when this occurs it is the duty of the government to raise output and total employment by stimulating aggregate demand through fiscal and monetary policy.

The Great Depression is often seen as a failure of capitalism. It popularised Keynesian economics and monetary central planning which, together, “eroded and eventually destroyed the great policy barrier – that is, the old-time religion of balanced budgets – that had kept America a relatively peaceful Republic until 1914.”

David Stockman of the Mises Institute argues that the Great Depression was the result of the delayed consequences of the Great War (1914 – 1918) and financial deformations created by modern central banking. However, the view that the Great Depression was a failure of capitalism is not one shared by every economist. The American economist, Milton Friedman (1912 – 2006), for example, argued that the Great Depression was a failure of monetary policy. Friedman pointed out that the total quantity of money in the United States – currency, bank deposits, and so forth – between 1929 and 1933 declined by one-third. He argued that the Federal Reserve had failed to prevent the decline of the quantity of money despite having the power and obligation to do so. According to Friedman, had the Federal Reserve acted to prevent the decline in the quantity of money, the United States (and subsequently, the world) would only have suffered a “garden variety recession” rather than a prolonged economic depression.

It is not possible to capture the full dimensions of the Great Depression in quantitative data alone. What is known, however, is that it caused a great deal of misery and despair among the peoples of the world. Failed macroeconomic policies combined with negative shocks caused the economic output of several countries to fall by between twenty-five and thirty percent between 1929 and 1932/33. In America between 1929 and 1933, production in mines, factories, and utilities fell by more than fifty percent, stock prices collapsed to a tenth of what they had been prior to the Wall Street crash, real disposable income fell by twenty-eight percent, and the number of unemployed rose from 1.6 million to 12.8 million.

According to What Caused the Great Depression, an article for the Foundation for Economic Education, the Great Depression occurred in three phases. First, the rise of “easy money policies” caused an economic boom followed by a subsequent crash. Second, following the crash, President Herbert Hoover (1874 – 1964) attempted to suppress the self-adjusting aspect of the market by engaging in interventionist policies. This caused a prolonged recession and prevented recovery. Hourly wage rates dropped by fifty percent, millions lost their jobs (a reality made worse by the absence of unemployment insurance), prices on agricultural products dropped to their lowest point since the Civil War (1861 – 1865), more than thirty thousand businesses failed, and hundreds of banks collapsed. Third, in 1933, at the lowest point of the Depression, the newly-elected President Franklin Delano Roosevelt (1882 – 1945) combatted the economic crisis by using “New Deal” economic policies to expand interventionist measures into almost every facet of the American economy.


Let’s talk about the New Deal a little bit more. The New Deal was the name for the Keynesian-based economic policies that President Roosevelt used to try and end the Great Depression. It included forty-seven Congress-approved programs that abandoned laissez-faire capitalism and enacted the kind of social and economic reforms that Europe had enjoyed for more than a generation. Ultimately, the New Deal aimed to create jobs, provide relief for farmers, boost manufacturing by building partnerships between the private and public sectors, and stabilise the US financial system.

The New Deal was largely inspired by the events of the Great War. During the War, the US Government had managed to increase economic activity by establishing planning boards to set wages and prices. President Roosevelt took this as proof positive that it was government guidance, not private business, that helped grow the economy. However, Roosevelt failed to realise that the increase in economic activity during the Great War came as the result of inflated war demands, not as the achievement of government planning. Roosevelt believed, falsely, that it was better to have government control the economy in times of crisis rather than relying on the market to correct itself.

The New Deal came in three waves. During his first hundred days in office, President Roosevelt approved the Emergency Banking Act, the Government Economy Act, the Civilian Conservation Corps, the Federal Emergency Relief Act, the Agricultural Adjustment Act, the Emergency Farm Mortgage Act, the Tennessee Valley Authority Act, the Securities Act, the Abrogation of the Gold Payment Clause, the Home Owners Refinancing Act, the Glass-Steagall Banking Act, the National Industrial Recovery Act, the Emergency Railroad Transportation Act, and the Civil Works Administration.

In 1934, President Roosevelt bolstered his initial efforts by pushing through the Gold Reserve Act, the National Housing Act, the Securities Exchange Act, and the Federal Communications Act.

In 1935, the Supreme Court struck down the National Industrial Recovery Act. President Roosevelt, concerned that other New Deal programs could also be in jeopardy, embarked on a litany of programs that would help the poor, the unemployed, and farmers. Second-wave New Deal programs included the Soil Conservation and Domestic Allotment Act, the Emergency Relief Appropriation, the Rural Electrification Act, the National Labor Relations Act, the Resettlement Act, and the Social Security Act.

In 1937, Roosevelt unleashed the third wave of the New Deal, this time aiming to combat budget deficits. It included the United States Housing Act (Wagner-Steagall), the Bonneville Power Administration, the Farm Tenancy Act, the Farm Security Administration, the Federal National Mortgage Association, the new Agricultural Adjustment Act, and the Fair Labor Standards Act.

According to the historical consensus, the New Deal proved effective in boosting the American economy. Economic growth increased by 1.8% in 1935, 12.9% in 1936, and 3.3% in 1937. It built schools, roads, hospitals, and more, prevented the collapse of the banking system, reemployed millions, and restored confidence among the American people.

Some even claim that the New Deal didn’t go far enough. Adam Cohen, the author of Nothing to Fear: FDR’s Inner Circle and the Hundred Days that Created Modern America, claims that the longevity of the Depression (the American economy didn’t return to pre-Depression prosperity until the 1950s) is evidence that more New Deal spending was needed. Cohen commented that the New Deal had the effect of steadily increasing GDP (gross domestic product) and reducing unemployment. And, what is more, it reimagined the US Federal government as a welfare provider, a stock-market regulator, and a helper of people in financial difficulty.

The historical consensus notwithstanding, the New Deal is not without its critics. It was criticised by many conservative businessmen for being too socialist. Others, such as Huey Long (1893 – 1935), criticised it for failing to do enough for the poor. Henry Morgenthau, Jr. (1891 – 1967), the Secretary of the Treasury, confessed before Democrats on the House Ways and Means Committee on May 9th, 1939 that the New Deal had failed as public policy. According to Morgenthau, it failed to produce an economic recovery and did not erase historic unemployment. Instead, it created a recession – the Roosevelt Recession – in 1937, failed to adequately combat unemployment because it created jobs that were only temporary, became the costliest government program in US history, and wasted money.

Conservatives offer supply-side economics as an alternative to demand-side economics. Supply-side economics aims at increasing aggregate supply. According to supply-side economics, the best way to stimulate economic growth or recovery is to lower taxes and thus increase the supply of goods and services. This increase leads, in turn, to lower prices and higher standards of living.

The lower-taxes policy has proved quite popular with politicians. The American businessman and industrialist, Andrew Mellon (1855 – 1937) argued for lower taxes in the 1920s, President John Fitzgerald Kennedy (1917 – 1963) argued for lower taxes in the 1960s, and both President Ronald Reagan (1911 – 2004) and President George Walker Bush (1946 – ) lowered taxes in the 1980s and 2000s, respectively.

Supply-side economics works on the principle that producers will create new and better products if they are allowed to keep their money. Put simply, supply-side economics (“supply” merely refers to the production of goods and services) works on the theory that cutting taxes on entrepreneurs, investors, and business-people incentivises them to invest more in their endeavours. This money can be invested in capital – industrial machinery, factories, software, office buildings, and so forth.

The idea that lower taxes lead to greater economic prosperity is one of the central tenets of supply-side economics. Supporters of supply-side economics believe that providing financial benefits for investors (cutting capital gains tax, for example) stimulates economic growth. By contrast, high taxes, especially those meted out on businesses, discourage investment and encourage stagnation.

Tax rates and tax revenue are not the same thing; they can move in opposite directions depending on economic factors. The revenue collected from income tax for each year of the Reagan Presidency was higher than the revenues collected during any year of any previous Presidency. It can be argued that people change their economic behaviour according to the way they are taxed. The problem with increasing taxes on the rich is that the rich will use legal, and sometimes illegal, strategies to avoid paying them. A businessman who is forced to pay forty percent of his business’s profits in tax is less likely to increase his productivity. As a consequence, high tax rates on businesses lead to economic stagnation.


Supply-side supporters use the Laffer Curve, devised by Arthur Laffer (1940 – ), an advisor to President Ronald Reagan, to argue that lower taxes can lead to higher tax revenue. The Laffer Curve illustrates the relationship between tax rates and the amount of tax revenue that is actually collected. Laffer’s idea was that, up to a point, raising tax rates collects more revenue; beyond that point, however, further increases collect less, because people are no longer willing to make an economic contribution.
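The logic of the curve can be shown with a deliberately crude model. The sketch below, in Python, assumes the taxable base shrinks linearly as the rate rises; that linear response is an illustrative assumption of mine, not Laffer’s own specification, and the 50% peak it produces is an artefact of the toy model rather than an empirical claim.

```python
# Toy Laffer curve: revenue = rate * base(rate), where the taxable base
# shrinks as the rate rises (people work, save, and invest less).
# The linear shrinkage below is an illustrative assumption only.

def revenue(rate, full_base=1000.0):
    """Tax revenue at a given rate (0.0 to 1.0) under the toy model."""
    base = full_base * (1.0 - rate)  # the base falls to zero at a 100% rate
    return rate * base

for tenths in range(11):
    rate = tenths / 10
    print(f"rate {rate:.0%}: revenue {revenue(rate):6.1f}")

# Revenue is zero at both a 0% and a 100% rate and, in this toy model,
# peaks at 50%. Where the real-world peak sits is an empirical question;
# the point is only that rates and revenue can move in opposite directions.
```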

Tax cuts only work, however, when they reduce the price of engaging in productive behaviour. Daniel Mitchell of the Heritage Foundation stated in an article entitled A “Supply-Side” Success Story that not all tax cuts are created equal. Mitchell wrote: “Tax cuts based on the Keynesian notion of putting money in people’s pockets in the form of rebates and credits do not work. Supply-side cuts, by contrast, do improve economic performance because they reduce tax rates on work, saving, and investment.” Mitchell used the differences between the 2001 and 2003 tax cuts as evidence for his argument: tax collections fell after the 2001 cuts but grew by six percent annually after the 2003 cuts; job numbers declined after the 2001 cuts, whereas net job creation averaged more than 150,000 after the 2003 cuts; and economic growth averaged 1.9% after the 2001 cuts, compared to 4.4% after the 2003 cuts.

Proposals to cut taxes have always been characterised by their opponents as “tax cuts for the rich.” The left believes that tax cuts, especially cuts to the top rate of tax, do not spur economic growth for lower and middle-class people and only serve to widen income inequality. They argue that tax cuts benefit the wealthy because the wealthy invest their newfound money in enterprises that benefit themselves. Bernie Sanders (1941 – ), the Independent Senator from Vermont, has argued that “trickle-down economics” is pushed by lobbyists and corporations to expand the wealth of the rich, and opponents of President Ronald Reagan’s tax cuts likewise referred to the policy as “trickle-down economics.”

In reality, the left-wing slander of tax cuts can best be described as “tax lies for the gullible.” The rich do not become wealthy by spending frivolously or by hiding their money under the mattress. They become rich because they are prepared to invest their money in new products and ventures that will generate greater wealth. It is far more sensible to give an investor, entrepreneur, or business owner a tax cut, because they are more likely to put their newfound wealth to productive use.

According to Prateek Agarwal at Intelligent Economist, supply-side economics is useful for lowering the natural rate of unemployment. Thomas Sowell, a supporter of supply-side economics, claims that while tax cuts are applied primarily to the wealthy, it is the working and middle classes who are the first and primary beneficiaries. This occurs because the wealthy, in Sowell’s view, are more likely to invest more money in their businesses which will provide jobs for the working class.

The purpose of economic policy is to facilitate the economic independence of citizens by encouraging economic prosperity. Demand-side economics and supply-side economics represent two different approaches to achieving this endeavour. Demand-side economics argues that economic prosperity can be achieved by having the government increase demand by taking control of the economy. By contrast, supply-side economics, falsely denounced as “trickle-down economics” by the likes of Juice Media, champions the idea that the best way to achieve economic prosperity is by withdrawing, as far as humanly possible, government interference from the private sector of the economy. Supply-side economics is the economic philosophy of freedom; demand-side economics is not.

THE INVASION OF EUROPE


In January of 2017, Emillem Khodagholli, a refugee on probation for a raft of offences that included death threats and assault, Maisam Afshar, another refugee well-known to Swedish authorities, and a third unidentified man made their way to Uppsala, where they broke into a young woman’s apartment. Streaming their despicable crime on Facebook, the three men tore off the young woman’s clothing and raped her for three hours at gunpoint. Afterwards, Khodagholli taunted his barely conscious victim as she tried to call for help. “You got raped”, he gloated. “There, we have the answers. You’ve been raped.”

Modern Europe’s migration crisis represents the most significant existential problem the continent has ever faced. The migration of millions of non-Europeans represents the largest mass movement of people into Europe since the Second World War. According to the International Organization for Migration, around a million migrants migrated to Europe in 2015. These migrants primarily came from Syria (268,795), Afghanistan (127,830), Iraq (97,125), Eritrea (19,100), Pakistan (15,525), and Nigeria (12,910).

For the most part, journalists, politicians, advocacy groups, and private organisations have attempted to paint Europe’s migration crisis as a human rights problem mired in social justice and global inequality. They would have Europeans believe that the people migrating into their countries are doctors, engineers, and other learned professionals fleeing from persecution.

In reality, these migrants come from a host of Middle Eastern, South Asian, and African countries and are travelling to Europe for a myriad of different reasons, of which fleeing persecution is only one. As the Netherlands’ European Commissioner, Frans Timmermans (1961 – ), pointed out: over half (sixty percent) of the people moving into Europe are not refugees, but economic migrants.

While the European Union remains committed to a pro-migration and open-borders policy, there remains the odd voice of dissent among its ranks. The President of Latvia, Valdis Zatlers (1955 – ), commented that while Europe was powerless (in his opinion) to stop migration, it could hope to manage the flow of people into the continent:

“We can’t stop this process, but we have not learnt how to manage it, and Europe was about ten years’ late to make decisions on illegal immigration and to help the countries where the migrants come from. In each country and in Europe as a whole, we have to think about how to manage the process and how to really decrease the expectations of people.”

Similarly, the Slovakian Prime Minister, Robert Fico (1962 – ) implored the European Union to put an end to the inflow of migrants. Fico described the Union’s distribution policy as an utter “fiasco” and warned they were committing ‘ritual suicide’ through their immigration policy.


The most notorious effect of ethnic crime in Europe has been the increase in sex crimes committed since millions of North African and Middle Eastern migrants poured into Europe. This begins with the sexual slavery of their own women. According to PBS, as of September 2016 around eighty percent of Nigerian women who made it to Italy had been forced into prostitution.

On January 9th, 2016, a forty-eight-year-old woman was raped by three Muslim men. On January 10th, 2016, a twenty-one-year-old West African man was arrested for raping a fifteen-year-old girl at a train station in Wuppertal. On January 15th, 2016, a public swimming pool in Bornheim was forced to ban all male migrants following reports that they had been sexually assaulting female patrons. On January 25th, 2016, a thirty-year-old Afghan man exposed himself to a nineteen-year-old woman on a public bus.

In Kiel, Germany, in 2016, three teenage girls, aged fifteen, sixteen, and seventeen, were stalked by two Afghan asylum seekers, aged nineteen and twenty-six, who filmed them on their mobile phones. A restaurant owner at the mall commented: “The moment they [male migrants] see a young woman wearing a skirt or any type of loose clothing, they believe they have a free pass.”

During New Year’s 2015/2016, thousands of women in Stuttgart, Cologne, and Hamburg were sexually assaulted. Remarkably, these crimes were ignored by the German authorities until eyewitness reports surfacing on social media forced them to take the problem seriously.

In Vienna, an Iraqi refugee who raped a ten-year-old boy at a public swimming pool had his conviction overturned by Austria’s Supreme Court despite compelling evidence of his guilt. The court deemed that the refugee, who had excused his despicable crime by claiming it was a “sexual emergency”, could not have known that the act was non-consensual. Thankfully, the refugee was sentenced to seven years’ imprisonment at his retrial.

In England, the predominantly Pakistani Rotherham child sex ring abducted, tortured, raped, and forced into prostitution at least fourteen hundred young girls over a period of sixteen years. According to Jihad Watch, those positioned to do something about the ring expressed “nervousness about identifying the ethnic origin of perpetrators for fear of being thought racist.” Others were instructed by their managers not to disclose the ethnic origin of the perpetrators.

Sweden has one of the highest incidences of rape in the world. According to a 2015 article published by the Gatestone Institute, in the forty years since Sweden decided to become a multicultural society, violent crime has increased by 300 percent and rape by 1,472 percent. In 1975, only 421 rapes were reported to Swedish police; in 2014, it was 6,620. This increase in the number of reported rapes can partially be explained by a broadening of the range of sexual acts legally classified as rape, and partially by an increase in the number of women willing to report their rapes who might otherwise have been uncomfortable doing so.
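For what it is worth, the headline percentage can be checked against the reported counts quoted above. A quick calculation (assuming the 1,472 percent figure refers to the change in reported rapes between 1975 and 2014) reproduces it almost exactly:

```python
# Reported rapes in Sweden, from the figures quoted above.
reported_1975 = 421
reported_2014 = 6620

increase = (reported_2014 - reported_1975) / reported_1975 * 100
print(f"{increase:.0f}%")  # prints 1472%, matching the quoted figure
```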

According to the Swedish National Council for Crime Prevention, 20,300 sexual assaults were reported, including 6,720 rapes. Statistics provided by the Council reveal that rape victims are most likely to be young women aged between sixteen and twenty-four. In fifty percent of cases, the rape occurs in a public place, as opposed to a residence (19%), the workplace or school (18%), or elsewhere (12%).

Migrant sex crime has, in essence, three causes: first, cultural differences in attitudes towards women between migrants and native Europeans; second, the educational and economic gap experienced by migrants; and third, a refusal to acknowledge the root causes of the problem.

The majority of migrants pouring into Europe come from a culture and civilisation that treat women as second-class citizens. There appears to be a belief among young Muslim men that an uncovered woman is an adulterer or a prostitute, and that she is, therefore, ‘fair game.’ It is an attitude that professes that all uncovered and non-Muslim women can be used for a Muslim man’s sexual gratification. Doctor Abd Al-Aziz Fawzan, a teacher of Islamic law in Saudi Arabia, opined: “if a woman gets raped walking in public alone, then she, herself, is at fault. She is only seducing men by her presence. She should have stayed home like a Muslim woman.”

The problem is further exacerbated by the educational and economic gap experienced by migrants. As a result of their low skills and education, coupled with their inability to speak the local language, many migrants are rendered virtually unemployable. Many of the migrants arriving in Europe will move further northward and find employment within illegal gangs that are often comprised of members of their own ethnic group.

Finally, migrant sex crime is also born of a stubborn refusal to acknowledge the root cause of the problem. “Every police officer knows he has to meet a particular political standard”, Rainer Wendt (1956 – ), the head of the German Police Union, stated. “It is better to keep quiet [about migrant crime] because you cannot go wrong.”

Europe is acting as the metaphorical canary in the coal mine. Europe’s decision to pursue relaxed immigration laws and open-border policies has led to a mass influx of non-European migrants into the continent. An unfortunate by-product of these decisions has been an increase in the number of sex crimes committed by migrants against native Europeans, and a total refusal by the authorities to acknowledge the root cause of the problem. Europe stands as a stark reminder of what happens to a continent that refuses to police its borders properly.

IT’S TIME FOR A RETURN TO TRADITION


Modernity is in trouble. From the menace of migrant crime in Europe to the sexual transgressions rife in modern-day Hollywood, the moral argument for modernity is quickly waning. How did things go so wrong? And how do we fix it? Perhaps a return to traditional values and ideals is in order.

The modern world developed over hundreds of years. The post-medieval period has seen the advent of tolerance as a social and political virtue, the rise of the nation-state, the increased role of science and technology in daily life, the development of representative democracy, the creation of property rights, urbanisation, mass literacy, print media, industrialisation, mercantilism, colonisation, the social sciences, modern psychology, emancipation, romanticism, naturalist approaches to art and culture, and the development of existential philosophy.  From the computer to the mobile phone, the motor car to the aeroplane, the marvels of the modern world are all around us.

The modern world has replaced the Aristotelean and faith-based concept of human life that was popular in the Middle Ages with a worldview based on science and reason. Modern intellectualism, therefore, follows the example set forth by Cartesian and Kantian philosophy: mistrusting tradition and finding its roots in science and rationality.

Culturally and intellectually, the 21st century represents the postmodern era. Postmodernism can be difficult to define accurately because the various cultural and social movements that use it as their central philosophy define it for their own purposes. Jean-François Lyotard (1924 – 1998), who introduced the term in his 1979 book, The Postmodern Condition, defined postmodernism as “incredulity towards metanarratives.” Similarly, Encyclopedia Britannica defines it as a philosophical movement in opposition to the philosophical assumptions and values of modern Western philosophy.

Postmodernism came about as a reaction to, indeed a rejection of, modernity. With its roots in the philosophies of Friedrich Nietzsche (1844 – 1900), Martin Heidegger (1889 – 1976), Sigmund Freud (1856 – 1939), and Karl Marx (1818 – 1883), the postmodernist rejects the philosophical theory of Foundationalism – the idea that knowledge is built upon a solid foundation – in favour of large-scale scepticism, subjectivism, and relativism.

The postmodernist likes to see himself as Beowulf fighting Grendel. That is, he likes to see himself as the mythical hero fighting the historical-critical monster. Inspired by doctrines of white privilege and toxic masculinity, and driven by a rhetoric that is anti-capitalist (except when it comes to their iPhones), anti-racist (provided the person isn’t white), anti-imperialist (but only European imperialism), and anti-transphobic (because gender is a “social construct”), the postmodernist-inspired neo-Marxists and social justice warriors have invaded the modern university and college campus.

Modernity and post-modernism have produced a swathe of existential and moral problems that the Western world has, as of yet, proved unable (or perhaps even unwilling) to solve. To begin, the modern world has abolished the central role that God, nature, and tradition once played in providing life with purpose. The German sociologist Max Weber (1864 – 1920) saw the Middle Ages, in spite of all their cruelty, as a highly humanistic period. Everything was considered to have a divine purpose. Even someone as lowly as a Medieval serf, for example, could feel that he had a role in God’s greater scheme. There was a sense of, as Martin Buber (1878 – 1965) puts it, “I-thou.” Modernity swapped “I-thou” for “I-it”. The human will replaced God as the ultimate arbiter of meaning.

This problem has been further exacerbated by the alienation of the human spirit to nature. Science, for all of its positive qualities, has had the effect of rendering nature meaningless. No longer is a thunderclap the voice of an angry God, nor does a cave contain a goblin or a mountain harbour a giant. Science may be an excellent means for understanding facts, but it is not a substitute for wisdom or tradition when it comes to determining human purpose. No longer does the natural world command the sense of reverential majesty that it once did.

The answer to the problems of the modern, and, by extension, post-modern, world is a revitalisation of the traditional beliefs, values, ideas, customs, and practices that have made the Western world great in the first place. We must reject the destructive ideas espoused by the postmodernists and work to revitalise our traditions. It is high time we started taking some pride in the traditions that have made our civilisation so great.

I Wandered Lonely as a Cloud


This week for our cultural article we will be examining William Wordsworth’s (1770 – 1850) 1815 poem, I Wandered Lonely as a Cloud.

Biography

William Wordsworth was born on April 7th, 1770 in Cockermouth, Cumberland to John Wordsworth (1740 – 1783), a legal agent to the Earl of Lowther (1736 – 1802), and Ann Wordsworth (1747 – 1778). Wordsworth was the second of John and Ann’s five children. Richard Wordsworth (1768 – 1816) came before him, and he was followed by Dorothy (1771 – 1855) (who would aid him throughout his career), Christopher (1774 – 1846), and John, Jr.

Wordsworth attended grammar school near Cockermouth Church as well as Ann Birkett’s school in Penrith. His love of the natural world began early, stemming from his childhood in a terraced garden house along the Derwent River.

Wordsworth experienced personal tragedy early in his life. In March of 1778, Ann Wordsworth died while visiting a friend in London. By June, Wordsworth’s beloved sister, Dorothy, had been sent to live with her mother’s cousin, Elizabeth Threlkeld (1745 – 1837), in Halifax. The pair would not be reunited until 1787. As if that wasn’t bad enough, John Wordsworth, Sr. died in December of 1783 after being forced to spend a night out in the cold. Following the death of his father, Wordsworth and his brothers were sent to live at the house of Ann Tyson and attended school at Hawkshead. It was here that Wordsworth first began composing poetry, an enterprise that was greatly encouraged by his headmaster, William Taylor.

In 1787, Wordsworth went to Cambridge University to attend St. John’s College as a sizar (an undergraduate student receiving financial assistance from the university). That same year, he published his first poem in The European Magazine. Although his academic career was unremarkable, Wordsworth managed to graduate with a Bachelor of Arts in 1791.

During his last term, Wordsworth and his friend, Robert Jones (1769 – 1835), embarked on a walking tour of Europe. The tour would prove a great influence on Wordsworth’s poetry, which began in earnest while he was travelling through France and Switzerland. During his travels, Wordsworth was also exposed to the ravages of the French Revolution, an experience which inspired his lifelong sympathy for the common man.

Between 1795 and 1800, Wordsworth and his sister, Dorothy, would move three times. In 1795, the pair used a legacy obtained from a close relative to move to Dorset. Two years later, they would move to Somerset where Wordsworth would become neighbours and close friends with the poet, Samuel Taylor Coleridge (1772 – 1834). Finally, in 1799, the pair would settle at Dove Cottage in Grasmere following a trip to Germany with Coleridge.

In 1802, Wordsworth returned to France with his sister to meet his daughter, Caroline (1792 – 1862), whom he had fathered illegitimately while living in France a decade earlier. Upon his return, he married his childhood friend, Mary Hutchinson (1770 – 1859). Together, the couple sired five children: Reverend John Wordsworth (1803 – 1875), Dora Wordsworth (1804 – 1847), Thomas Wordsworth (1806 – 1812), Catherine Wordsworth (1809 – 1812), and William Wordsworth, Jr. (1810 – 1883).

In 1813, Wordsworth was made the Distributor of Stamps for Westmorland. Years later, following the death of Robert Southey (1774 – 1843), Wordsworth was made Poet Laureate. He died on April 23rd, 1850, in Rydal.

Poem


I wandered lonely as a Cloud
That floats on high o’er Vales and Hills,
When all at once I saw a crowd,
A host of golden Daffodils;
Beside the Lake, beneath the trees,
Fluttering and dancing in the breeze.

Continuous as the stars that shine
And twinkle on the Milky Way,
They stretched in never-ending line
Along the margin of a bay:
Ten thousand saw I at a glance,
Tossing their heads in sprightly dance.

The waves beside them danced, but they
Out-did the sparkling waves in glee:—
A Poet could not but be gay
In such a jocund company:
I gazed—and gazed—but little thought
What wealth the shew to me had brought:

For oft when on my couch I lie
In vacant or in pensive mood,
They flash upon that inward eye
Which is the bliss of solitude,
And then my heart with pleasure fills,
And dances with the Daffodils.

Analysis


William Wordsworth is credited with ushering in the English Romantic movement. Accordingly, he is remembered as an intensely spiritual and epistemological writer whose poetry moved away from the grand, moralising themes of the past towards work that explored the purity and beauty of nature.

I Wandered Lonely as a Cloud was first published in Poems in Two Volumes in 1807. (The version analysed here is the 1815 revised version). The poem was inspired by a long walk Wordsworth took with his sister, Dorothy, around Glencoyne Bay, Ullswater. During their walk, the pair came across a “long belt” of daffodils. Wordsworth became inspired to write the poem after reading his sister’s diary description of the walk:

“When we were in the woods beyond Gowbarrow park we saw a few daffodils close to the water side, we fancied that the lake had floated the seed ashore and that the little colony had so sprung up – But as we went along there were more and yet more and at last under the boughs of the trees, we saw that there was a long belt of them along the shore, about the breadth of a country turnpike road. I never saw daffodils so beautiful they grew among the mossy stones about and about them, some rested their heads upon these stones as on a pillow for weariness and the rest tossed and reeled and danced and seemed as if they verily laughed with the wind that blew upon them over the Lake, they looked so gay ever glancing ever changing. This wind blew directly over the lake to them. There was here and there a little knot and a few stragglers a few yards higher up but they were so few as not to disturb the simplicity and unity and life of that one busy highway – We rested again and again. The Bays were stormy and we heard the waves at different distances and in the middle of the water like the Sea.”

— Dorothy Wordsworth, The Grasmere Journal, Thursday 15 April 1802.

I Wandered Lonely as a Cloud consists of four stanzas of six lines each and features an ABABCC rhyme scheme. The poem has a peaceful and tranquil feel to it, which is expressed through simple language, figurative vocabulary, and subtle rhymes. The first three stanzas of the poem describe the narrator’s experiences. Its first line, “I wandered lonely as a cloud”, serves to personalise the poem. Likewise, the reference to “a crowd, a host of golden daffodils” describes an ideal place, a form of euphoric paradise which the narrator experiences for the briefest period of time. The second stanza gives the impression that the daffodils were majestic, even other-worldly in their beauty. The narrator even compares them to the stars of the Milky Way. The poem’s last stanza details the poet’s recollection of his experiences. He describes how his recollection causes his heart to fill with pleasure and “dance with the daffodils.” In the end, I Wandered Lonely as a Cloud reminds us that beauty can only be found when we are willing to slow down and take notice of the world around us.

THE LEGACY OF MARGARET THATCHER


Margaret Thatcher (1925 – 2013) is a titan of world politics: a conservative heavyweight who effectively championed the conservative ethos in the public sphere and, in doing so, managed to transform her country for the better.

Margaret Thatcher was born Margaret Hilda Roberts on October 13th, 1925 above a greengrocer’s store in Grantham, Lincolnshire. She was an ambitious and driven student who won scholarships to Kesteven and Grantham Girls’ School and Oxford University. After university, Thatcher worked as a chemist but abandoned that career to read for the bar after meeting her husband, Denis Thatcher (1915 – 2003), whom she married in 1954, the same year she became a fully qualified lawyer. Thatcher became the Conservative member for Finchley in 1959.

During her rise to power, Thatcher was not massively popular. She faced opposition because of her gender – when she was elected she was one of only twenty-four female Parliamentarians (out of six hundred members) and, even more unusually, was the mother of twins – and because of her social class; the Conservative Party had not changed its structure since the 19th century. She was often denounced as the “grocer’s daughter”; one conservative politician even commented that she was “a good-looking woman without doubt, but common as dirt.” In spite of these barriers, Thatcher rose through numerous junior ministerial positions to become the shadow education spokeswoman in 1967. She became Secretary of State for Education and Science when Edward Heath (1916 – 2005) became Prime Minister in June of 1970, and leader of the Conservative Party in 1975.

Margaret Thatcher was Conservative Prime Minister of Great Britain from 1979 to 1990, and in her time she changed Britain and helped define the era she lived in. She became Prime Minister after defeating James Callaghan (1912 – 2005) with a seven percent majority. There were many reasons for the Conservative victory, the main ones being economic failure and the government’s inability to control the trade unions. Thatcher was seen as aggressive but also as something of a paradox. She was the first scientist in Downing Street and was enthusiastic about pushing Great Britain’s technological innovations forward, yet she was an anti-counterculture revolutionary who opposed the trade unions and the socialism they represented.

During Thatcher’s first term, however, it was the economy that needed the most attention. By the late 1970s, inflation in Great Britain had peaked at twenty percent due to rising oil prices and wage-push inflation. The once mighty nation had become known as the ‘sick man of Europe’. According to the Organisation for Economic Cooperation and Development, by 1980/81 Britain was suffering from downward trends in employment and productivity. The great industrial cities were in decline. Glasgow, for example, had seen its population fall from 1.2 million following World War One to eight hundred thousand in the early 1980s. In some areas of Glasgow, male unemployment would remain at between sixty and seventy percent throughout the 1980s. The director of the Department of Applied Economics, Wynne Godley, said of the prospect of the 1980s: “it is a prospect so dreadful I cannot really believe there won’t be a sort of political revolution which will demand a basic change to policy.”

Inflation, particularly cost-push inflation, was seen as the biggest enemy. However, Thatcher knew that tackling inflation would require restricting the flow of money and causing mass job losses. It was a sacrifice she was willing to make. The government had a three-step process for tackling the issue. First, they increased interest rates. Second, they reduced the budget deficit by raising taxes and cutting government spending. Third, they pursued monetarist policies to control the supply of money. Despite great job losses, the economy slowly improved over Thatcher’s first two years in power.

In 1981, however, her policies pushed the country into recession and unemployment reached three million. Indeed, high unemployment would remain a characteristic of the 1980s. Following the recession, Great Britain saw a period of economic growth with inflation dropping below four percent, although unemployment climbed to 3.2 million before easing off a little. It is also worth noting that, despite the mass unemployment, average earnings were rising twice as fast as inflation, and those in employment had it better than ever. The Secretary of State for Transport, David Howell (1936 – ), stated in 1983: “if the conservative revolution has an infantry, it is the self-employed. It is in the growth of the self-employed, spreading out to small family businesses, that the job opportunities of the future are going to come.” Thatcher’s biggest achievement in her first term, and the one which endeared her most to the British public, was the Falklands War. Following the Argentinean surrender in 1982, Thatcher declared: “today has put the Great back into Britain.” The Falklands War rekindled the British public’s pride in its navy and in the nation itself.


The Conservative Party won the 1983 election by an overwhelming majority, and Thatcher became its uncontested leader and saviour. She used the victory as an opportunity to change the configuration of the party and reshape it in her image. She fired the Foreign Secretary, Francis Pym (1922 – 2008), and sent the Home Secretary, William Whitelaw (1918 – 1999), to the House of Lords. Having ended the ancien régime, she refilled the front bench with dedicated Thatcherites. Only one old Etonian remained: the Lord Chancellor, Lord Hailsham (1907 – 2001), then in his seventies. Thatcher then embarked on a policy of privatisation and deregulation with the intention of decreasing dependency on the government and encouraging personal responsibility. Critics accused her of attempting to dismantle the welfare state and of refusing to provide a basic safety net for those down on their luck. Unusually for an anti-socialist, Thatcher abolished the Greater London Council along with six metropolitan county councils in an attempt to control local government from Whitehall.

The Conservatives won the 1987 election with a majority of more than one hundred, despite losing twenty-one seats. Thatcher now focused on social issues, embarking on a seven-step program of social engineering. First, the program actively encouraged women to stay at home and look after their children rather than join the workforce. Second, it proposed putting the care of the old, the unemployed, and the disabled into the hands of families. Third, it suggested helping parents set up their own schools. Fourth, it proposed support for schools with a clear moral base, including religious schools. Fifth, it suggested creating a voucher system to encourage parents to send their children to private schools. Sixth, it proposed training children in the management of pocket money and the setting up of savings accounts. Seventh, it sought to alter the way the public viewed wealth creation, so that it would be seen as an admirable pursuit. Thatcher’s tenure as Prime Minister ended when she stood down after her cabinet declined to back her through a second round of a leadership challenge. She was replaced by John Major (1943 – ).

After leaving office, Thatcher wrote two memoirs: The Downing Street Years (1993) and The Path to Power (1995). She was known as many things, including ‘The Last of the Eminent Victorians’, ‘New Britannia’, and, most famously, ‘The Iron Lady’. Despite her many years in politics and her eleven years as Prime Minister, however, Thatcher was never a populist, probably because her deep personal convictions were always stronger than her fear of the consequences. She did, nonetheless, command respect from the public: satire almost always focused on her husband Denis rather than on her, and it is worth noting that in her time Thatcher never lost an election. As a politician, she revolutionised political debate, transformed the Conservative Party, and altered many aspects of British life that had long been deemed permanent. Paul Johnson (1928 – ), a prominent English journalist, said of her abilities as a politician: “though it is true in Margaret Thatcher’s case, she does have two advantages. She did start quite young. She does possess the most remarkable physical stamina of any politician I’ve come across.” In her time, Thatcher was determined to curb government subsidies to industry and to end the power of the trade unions. She made the unions liable for damages when their actions were unlawful and forced the Labour Party to modernise itself. Margaret Thatcher was an impressive and important Prime Minister whose political career and personality helped change Great Britain for the better.

BIBLIOGRAPHY

  1. British Broadcasting Corporation., 2001. Dome Woes Haunt Blair. [Online]
    Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/1172367.stm
    [Accessed 8 10 2014].
  2. British Broadcasting Corporation., 2008. 1979: Thatcher Wins Tory Landslide. [Online]
    Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/vote_2005/basics/4393311.stm
    [Accessed 10 8 2014].
  3. British Broadcasting Corporation., 2008. 1983: Thatcher Triumphs Again. [Online]
    Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/vote_2005/basics/4393313.stm
    [Accessed 8 10 2014].
  4. British Broadcasting Corporation., 2008. 1987: Thatcher’s Third Victory. [Online]
    Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/vote_2005/basics/4393315.stm
    [Accessed 8 10 2014].
  5. British Broadcasting Corporation., 2008. 1989: Malta Summit Ends Cold War. [Online]
    Available at: http://news.bbc.co.uk/onthisday/hi/dates/stories/december/3/newsid_4119000/4119950.stm
    [Accessed 12 10 2014].
  6. British Broadcasting Corporation., 2008. 1990: Thatcher Quits as Prime Minister. [Online]
    Available at: http://news.bbc.co.uk/onthisday/hi/dates/stories/november/22/newsid_2549000/2549189.stm
    [Accessed 8 10 2014].
  7. Chaline, E., 2011. Iron Maiden: Margaret Thatcher. In: History’s Worst Predictions and the People Who Made Them. England: Quid Publishing, pp. 194 – 199.
  8. Crewe, I. and Searing, D.D., 1988. Ideological Change in the British Conservative Party. The American Political Science Review, 82(2), pp. 361 – 384.
  9. Davies, S., 1993. Margaret Thatcher and the Rebirth of Conservatism. [Online]
    Available at: http://ashbrook.org/publications/onprin-v1n2-davies/
    [Accessed 28 09 2014].
  10. Elnaugh, R., 2013. Thatcher’s Children: Growing Up in 1980s Britain. [Online]
    Available at: http://www.channel4.com/news/thatchers-children-growing-up-in-1980s-britain
    [Accessed 5 10 2014].
  11. Garrett, G., 1992. The Political Consequences of Thatcherism. Political Behaviour, 14(4), pp. 361 – 382.
  12. Gray, J., 2004. Blair’s Project in Retrospect. International Affairs, 80(1), pp. 39 – 48.
  13. Heffer, S., 2013. Kevin Rudd is Just Like Tony Blair. [Online]
    Available at: http://www.spectator.co.uk/australia/australia-features/8996621/kevin-rudd-is-just-like-tony-blair/
    [Accessed 29 09 2014].
  14. Jones, M., 1984. Thatcher’s Kingdom: A View of Britain in the Eighties. Sydney: William Collins Pty Ltd.
  15. King, A., 2002. The Outsider as Political Leader: The Case of Margaret Thatcher. British Journal of Political Science, 32(3), pp. 435 – 454.
  16. Kirkup, J. and Prince, R., 2008. Labour Party Membership Falls to Lowest Level Since it was Founded in 1900. [Online]
    Available at: http://www.telegraph.co.uk/news/politics/2475301/Labour-membership-falls-to-historic-low.html
    [Accessed 8 10 2014].
  17. Sweet and Maxwell, 2007. Tony Blair’s Legacy: 20% Jump in Amount of Legislation Introduced Per Year. [Online]
    Available at: https://www.sweetandmaxwell.co.uk/about-us/press-releases/010607.pdf
    [Accessed 8 10 2014].
  18. Merriam-Webster, 2014. Spin Doctor. [Online]
    Available at: http://www.merriam-webster.com/dictionary/spin%20doctor
    [Accessed 8 10 2014].
  19. McSmith, A., Chu, B., Garner, R. and Laurance, J., 2013. Margaret Thatcher’s Legacy: Spilt Milk, New Labour, and the Big Bang – She Changed Everything. [Online]
    Available at: http://www.independent.co.uk/news/uk/politics/margaret-thatchers-legacy-spilt-milk-new-labour-and-the-big-bang–she-changed-everything-8564541.html
    [Accessed 8 10 2014].
  20. McTernan, J., 2014. Tony Blair: His Legacy will be Debated But Not Forgotten. [Online]
    Available at: http://www.telegraph.co.uk/news/politics/tony-blair/10977884/Tony-Blair-His-legacy-will-be-debated-but-not-forgotten.html
    [Accessed 5 10 2014].
  21. Palmer, A., 1964. Conservative Party. In: The Penguin Dictionary of Modern History 1789 – 1945. Victoria: Penguin Books, p. 90.
  22. Palmer, A., 1964. Labour Party. In: The Penguin Dictionary of Modern History 1789 – 1945. Victoria: Penguin Books, pp. 181 – 182.
  23. Pettinger, T., 2012. UK Economy in the 1980s. [Online]
    Available at: http://www.economicshelp.org/blog/630/economics/economy-in-1980s/
    [Accessed 5 10 2014].
  24. Purvis, J., 2013. What was Margaret Thatcher’s Legacy for Women?. Women’s History Review, 22(6), pp. 1014 – 1018.
  25. Silverman, J., 2007. Blair’s New Look Civil Liberties. [Online]
    Available at: http://news.bbc.co.uk/2/hi/uk_news/politics/4838684.stm
    [Accessed 8 10 2014].
  26. Thatcher, M., 1960. Public Bodies (Admission of the Press to Meetings) Bill. [Online]
    Available at: http://www.margaretthatcher.org/document/101055
    [Accessed 12 10 2014].
  27. Turner, L., 2011. Chariots of Fire: Tony Blair’s Legacy. [Online]
    Available at: http://www.themonthly.com.au/tony-blair-s-legacy-chariots-fire-lindsay-tanner-3183
    [Accessed 29 09 2014].
  28. UK Government, 2014. Baroness Margaret Thatcher. [Online]
    Available at: https://www.gov.uk/government/history/past-prime-ministers/margaret-thatcher
    [Accessed 29 09 2014].
  29. UK Government, 2014. Tony Blair. [Online]
    Available at: https://www.gov.uk/government/history/past-prime-ministers/tony-blair
    [Accessed 29 09 2014].
  30. Warrell, M., 2013. Margaret Thatcher: An Icon of Leadership Courage. [Online]
    Available at: http://www.forbes.com/sites/margiewarrell/2013/04/08/margaret-thatcher-an-icon-of-leadership-courage/
    [Accessed 28 09 2014].
  31. Younge, G., 2013. How Did Margaret Thatcher Do It?. [Online]
    Available at: http://www.thenation.com/article/173732/how-did-margaret-thatcher-do-it
    [Accessed 28 09 2014].

Free Speech Matters


There has been an alarming trend in modern culture: numerous political and social activist groups have been attempting to use the pernicious and false doctrines of political correctness, tolerance, and diversity to silence those they disagree with. Many of these groups have sought the passage of so-called “hate speech” laws designed to silence voices of dissent.

At public colleges and universities, places where free speech and open debate should be actively encouraged, the measures taken to suppress voices of dissent – protests, disruption, and, in some cases, outright violence – have become tantamount to Government censorship. This censorship prevents students from inviting the speakers they wish to hear and from debating the speech they disagree with. Eva Fourakis, the editor-in-chief of The Williams Record (the student newspaper of Williams College), wrote an editorial, later recanted, commenting that “some speech is too harmful to invite to campus.” The editorial went on to say: “students should not face restrictions in terms of the speakers they bring to campus, provided of course that these speakers do not participate in legally recognised forms of hate speech.”

The University of California, Berkeley, is famous for sparking the free speech movement of the 1960s. Today, however, it has become a haven for radical, anti-free speech Neo-Marxists and social justice warriors. Not only have many Republican students had their personal property destroyed, but numerous conservative speakers have had their talks disrupted, and, in some cases, halted altogether. In February, Antifa – so-called anti-fascists – set fires and vandalised buildings during a speech by the controversial journalist, Milo Yiannopoulos (1984 – ). In April, threats of violence aimed at members of the Young America’s Foundation forced the political commentator, Ann Coulter (1961 – ), to cancel her speech. A speech by David Horowitz (1939 – ), founder and president of the David Horowitz Freedom Center, was cancelled after organisers discovered that the event would take place during normal class times (for safety reasons, or so it was claimed). Finally, security for a speech at UC Berkeley by the conservative journalist, Ben Shapiro (1984 – ), cost some US$600,000. These events show that those who wish to use disruption, vilification, threats, and outright violence to silence others can be, and often are, successful in doing so.


Like most of the principles of classical liberalism, free speech developed through centuries of political, legal, and philosophical progress. And like many Western ideas, its development can be traced back to the Ancient Greeks. During his trial in Athens in 399 BC, Socrates (470 BC – 399 BC) expressed the belief that the ability to speak was man’s most divine gift. “If you offered to let me off this time on condition I am not any longer to speak my mind”, Socrates stated, “I should say to you, ‘Men of Athens, I shall obey the Gods rather than you.’”

Sixteen hundred years later, in 1215, the Magna Carta became the founding document of English liberty. In 1516, Desiderius Erasmus (1466 – 1536) wrote in The Education of a Christian Prince that “in a free state, tongues too should be free.” In 1633, the astronomer Galileo Galilei was put on trial by the Catholic Church for advocating a heliocentric model of the solar system. In 1644, the poet John Milton (1608 – 1674), author of Paradise Lost, warned in Areopagitica that “he who destroys a good book kills reason itself.” Following the overthrow of King James II (1633 – 1701) by William III (1650 – 1702) and Mary II (1662 – 1694) in 1688, the English Parliament passed the English Bill of Rights, which guaranteed free elections, regular parliaments, and freedom of speech in Parliament.

In 1789, the French Declaration of the Rights of Man and of the Citizen, an important document of the French Revolution, provided for freedom of speech (needless to say, Robespierre and company were not very good at actually honouring this ideal). The philosopher Voltaire (1694 – 1778) had famously written: “I detest what you write, but I would give my life to make it possible for you to continue to write.” Over in the United States, the First Amendment of the US Bill of Rights, ratified in 1791, guaranteed freedom of religion, freedom of speech, freedom of the press, and the right to assemble:

ARTICLE [I] (AMENDMENT 1 – FREEDOM OF SPEECH AND RELIGION)

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

During the 19th century, the British philosopher John Stuart Mill (1806 – 1873) argued for toleration and individuality in his 1859 essay, On Liberty. “If any opinion is compelled to silence”, Mill warned, “that opinion may, for aught we can certainly know, be true. To deny this is to presume our own infallibility.” Mill believed that all doctrines, no matter how immoral or offensive, ought to be given public exposure. He stated in On Liberty:

“If the arguments of the present chapter are of any validity, there ought to exist the fullest liberty of professing and discussing, as a matter of ethical conviction, any doctrine, however immoral it may be considered.”

Elsewhere in On Liberty, Mill warned that the suppression of one voice was as immoral as the suppression of all voices:

“If all mankind minus one were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person than he, if he had the power, would be justified in silencing mankind.”

Centuries later, in 1948, the Universal Declaration of Human Rights, adopted by the United Nations General Assembly, urged member states to promote civil, human, economic, social, and political rights – including freedom of expression and religion.

The United States Supreme Court.

Within the American justice system, numerous Supreme Court cases have created judicial protections for freedom of speech. In National Socialist Party of America v. Village of Skokie (1977), the Supreme Court upheld the right of neo-Nazis to march through a village with a large Jewish population while wearing Nazi insignia. The Justices found that the promotion of religious hatred was not a sufficient reason to restrict free speech.

In the city of St. Paul during the early 1990s, a white teenager was arrested under the city’s “Bias-Motivated Crime Ordinance” after he burnt a cross made from a broken chair (cross-burning is commonly used by the Ku Klux Klan to intimidate African Americans) in the front yard of an African American family. The Supreme Court ruled that the Ordinance was unconstitutional. Justice Antonin Scalia (1936 – 2016) noted that the purpose of restricting fighting words was to prevent civil unrest, not to ban the content or message of the speaker’s words. Scalia wrote in R.A.V. v. City of St. Paul (1992):

“The ordinance applies only to ‘fighting words’ that insult, or provoke violence, ‘on the basis of race, colour, creed, religion or gender.’ Displays containing abusive invective, no matter how vicious or severe, are permissible unless they are addressed to one of the specified disfavored topics. Those who wish to use ‘fighting words’ in connection with other ideas—to express hostility, for example, on the basis of political affiliation, union membership, or homosexuality—are not covered. The First Amendment does not permit St. Paul to impose special prohibitions on those speakers who express views on disfavored subjects.”

In Matal v. Tam (2017), the Supreme Court found that a provision of the Lanham Act prohibiting the registration of trademarks that disparage persons, institutions, beliefs, or national symbols violated the First Amendment. Justice Samuel Alito (1950 – ) opined:

“[The idea that the government may restrict] speech expressing ideas that offend … strikes at the heart of the First Amendment. Speech that demeans on the basis of race, ethnicity, gender, religion, age, disability, or any other similar ground is hateful; but the proudest boast of our free speech jurisprudence is that we protect the freedom to express ‘the thought that we hate’.”

Justice Anthony Kennedy (1936 – ) opined:

“A law found to discriminate based on viewpoint is an “egregious form of content discrimination,” which is “presumptively unconstitutional.” … A law that can be directed against speech found offensive to some portion of the public can be turned against minority and dissenting views to the detriment of all. The First Amendment does not entrust that power to the government’s benevolence. Instead, our reliance must be on the substantial safeguards of free and open discussion in a democratic society.”


In recent years, numerous calls to ban speech have been justified on the basis that the speech in question is “hateful.” Much of this has come from the political left, who (in what one may cynically regard as having more to do with silencing voices of dissent than with protecting vulnerable groups) argue that restrictions on hate speech must occur if minorities are to be given equal status with everyone else.

That certain types of speech can be offensive, and that some of that speech may be aimed at certain groups of people, is undeniable. Hate speech has even been accused of undermining democracy! In one article, Alexander Tsesis, Professor of Law at Loyola University, wrote: “hate speech is a threatening form of communication that is contrary to democratic principles.” Some have even argued that hate speech violates the Fourteenth Amendment to the US Constitution, which guarantees equal protection under the law:

ARTICLE XIV (AMENDMENT 14 – RIGHTS GUARANTEED: PRIVILEGES AND IMMUNITIES OF CITIZENSHIP, DUE PROCESS, AND EQUAL PROTECTION)

1: All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside. No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States; nor shall any State deprive any person of life, liberty, or property, without due process of law; nor deny to any person within its jurisdiction the equal protection of the laws.

That there is a historical basis for restricting hate speech is undeniable. Slavery, Jim Crow, and the Holocaust, among other atrocities, were all preceded by violent and hateful rhetoric; indeed, genocide is almost always preceded by hate speech, and incitement to genocide is considered a serious crime against humanity under international law. However, what proponents of hate speech laws fail to realise is that the countries that perpetrated these atrocities did not extend the freedom to speak to the groups they were targeting. Joseph Goebbels (1897 – 1945), the Nazi minister for public enlightenment and propaganda, had such an iron grip on Germany’s media that any voice contradicting the Nazis’ anti-Semitic propaganda had no opportunity to be heard.


But who, exactly, supports hate speech laws? Analysis of survey data from the Pew Research Center and YouGov suggests that supporters are primarily non-white, millennial Democrats. In terms of age, the Pew Research Center found that forty percent of millennials supported Government censorship of hate speech, compared to twenty-seven percent of Gen Xers, twenty-four percent of baby boomers, and only twelve percent of the silent generation.


In terms of race, research by YouGov reveals that sixty-two percent of African Americans support Government censorship of hate speech, followed by fifty percent of Hispanics and thirty-six percent of white Americans.


In terms of political affiliation, research from YouGov taken in 2015 found that fifty-one percent of Democrats supported restrictions on hate speech, compared to thirty-seven percent of Republicans, and only thirty-five percent of independents.

The primary issue with hate speech is that deciding what it does and does not constitute is very difficult; as Christopher Hitchens (1949 – 2011) pointed out, someone must be given the authority to make that decision. (The cynic may argue, fairly, that hate speech begins whenever a speaker expresses a view, states a fact, or voices an opinion that another person does not want others to hear.)

The second issue with hate speech laws is that they can easily be used by one group to silence another. Such censorship is often aimed at particular groups of individuals purely for ideological or political purposes, frequently with the justification that it increases the freedom and equality of the people the advocates claim to represent.

In Canada, Bill C-16 sought to outlaw “hate propaganda” aimed at members of the community distinguishable by their gender identity or expression. The Bill originated with a policy paper by the Ontario Human Rights Commission which sought to determine what constituted discrimination on the basis of gender identity and expression; this included “refusing to refer to a person by their self-identified name and proper personal pronoun.” Supporters of Bill C-16 see it as an important step towards the creation of legal protections for historically marginalised groups. Detractors, however, have expressed concern that the Bill creates a precedent for Government-mandated speech.

The Canadian clinical psychologist and cultural critic, Professor Jordan Peterson (1962 – ), first came to public attention when he posted a series of YouTube videos warning of the dangers of political correctness and criticising Bill C-16. In his videos, Professor Peterson warned that the law could be used to police speech and compel individuals to use ‘transgender pronouns’ (terms like ‘ze’ and ‘zer’, among others). For his trouble, Peterson was accused of violence by a fellow panellist on The Agenda with Steve Paikin, received two warning letters from the University of Toronto in 2016, and was denied a social research grant from Canada’s Social Sciences and Humanities Research Council.


A Nazi torch-light rally. 

Europe has been experiencing similar attempts to silence speech. A law passed by the Bundestag this year will force social media companies operating in Germany to delete racist or slanderous comments and posts within twenty-four hours or face fines of up to €50 million. Additionally, numerous public figures have found themselves charged with hate speech crimes merely for pointing out the relationship between the large influx of non-European migrants and high crime rates, particularly in terms of rape and terrorism. One politician in Sweden was prosecuted for daring to post immigrant crime statistics on Facebook.

In Great Britain, Freedom of Information documents reveal that around twenty thousand adults and two thousand children had been investigated by the police for comments they made online. In politics, the British politician Paul Weston (1965 – ) found himself arrested after he quoted a passage on Islam written by Winston Churchill (1874 – 1965). In Scotland, a man was charged under the Communications Act 2003 with the improper use of electronic communications after he filmed his dog making a Hitler salute.

In Australia, the Herald Sun columnist, Andrew Bolt (1959 – ), was found to have contravened section 18C of the Racial Discrimination Act after he published articles accusing fair-skinned Aborigines of using their racial status for personal advantage. The law firm Holding Redlich, acting for a group of Aboriginal people, demanded that the Herald Sun retract two of Bolt’s articles, written in April and August of 2009, and restrain him from writing similar articles in the future. Joel Zyngier, who acted for the group pro bono, told Melbourne’s The Age:

“We see it as clarifying the issue of identity—who gets to say who is and who is not Aboriginal. Essentially, the articles by Bolt have challenged people’s identity. He’s basically arguing that the people he identified are white people pretending they’re black so they can access public benefits.”

Justice Mordecai Bromberg (1959 – ) found that the people targeted by Bolt’s articles were reasonably likely to have been “offended, insulted, humiliated, or intimidated.”

We need speech to be as free as possible because it is speech that allows us to exchange and critique information. It is through free speech that we are able to keep our politicians and public officials in check, to critique public policy, and to disseminate information. As the Canadian cognitive psychologist Steven Pinker (1954 – ) observed: “free speech is the only way to acquire knowledge about the world.” Measures taken to restrict free speech, whether the criminalisation of hate speech or any other, are a complete contradiction of the principles upon which free Western democracies are founded.

FATS DOMINO


This week for our cultural article, we will be celebrating the life of Fats Domino: the legendary New Orleans rock ‘n’ roller who died last Tuesday at the age of eighty-nine.

Fats Domino was born Antoine Dominique Domino, Jr. on February 26th, 1928, in New Orleans, Louisiana, the youngest of Antoine Caliste Domino’s (1879 – 1964) and Marie-Donatille Gros’ (1886 – 1971) eight children. Fats came from a musical family. At seven years old, he was taught to play the piano by his brother-in-law, Harrison Verret (1907 – 1965), who also introduced him to the New Orleans music scene, which would become a major influence on his later music.

By the age of ten, Fats was performing as a singer and pianist. Four years later, he dropped out of school to pursue a career in music, supporting himself with odd jobs – factory work, hauling ice, and so forth. By 1946, Fats was playing piano with the well-known New Orleans bass player and bandleader Billy Diamond (1916 – 2011). It was Diamond who gave Domino the nickname ‘Fats’. Years later, Diamond would reminisce:

“I knew Fats from hanging out at a grocery store. He reminded me of Fats Waller and Fats Pichon. Those guys were big names and Antoine—that’s what everybody called him then—had just got married and gained weight. I started calling him ‘Fats’ and it stuck.”

Diamond’s audiences were impressed by Fats’ rare talents, and by the end of the 1940s the New Orleans pianist had attracted a very substantial following. As a musician, Fats was versed in numerous styles – blues, boogie-woogie, ragtime – and had drawn inspiration from pianists like Meade Lux Lewis (1905 – 1964) and singers like Louis Jordan (1908 – 1975).

In 1949, Fats met his long-term collaborator, Dave Bartholomew (1920 – ), and around the same time signed a record contract with Imperial Records. Fats’ first single for the label, The Fat Man (a play on his nickname), would sell a million copies and reach number two on the rhythm and blues charts.

Fats stood out as a performer thanks to the combination of his baritone voice, his unique piano-playing style, the saxophone riffs of Herbert Hardesty (1925 – 2016), and the drum after-beats of Earl Palmer (1924 – 2008). The release of Ain’t That A Shame in 1955 exposed Fats to the mainstream public and helped make him the most popular African American rock ’n’ roll artist of his day. His upward trajectory continued in 1956 with two film performances – Shake, Rattle and Rock! and The Girl Can’t Help It – and the recording of five top-forty hits, including My Blue Heaven and Blueberry Hill (which reached number two).

By the early 1960s, however, Fats’ music had lost much of its original popularity. In 1963, he moved to ABC-Paramount Records and parted ways with his long-time collaborator, Dave Bartholomew. The arrangement would be short-lived: by 1965, Fats had left ABC-Paramount, returned to New Orleans, and rekindled his professional relationship with Bartholomew.

Fats and Bartholomew would continue to collaborate until 1970, culminating in the 1968 cover of The Beatles’ Lady Madonna (ironically, a song that was itself a tribute to Fats Domino). During this period, however, Fats failed to achieve significant chart success. In 1986, he was inducted into the Rock and Roll Hall of Fame as part of its inaugural class.

Fats retired from touring following a health scare in Europe in 1995. Apart from the occasional performance at the New Orleans Jazz and Heritage Festival, he lived a mostly private life with his wife, Rosemary Hall (1930 – 2008), and his eight children. In 1998, Fats accepted a National Medal of Arts from President Bill Clinton (1946 – ).

Fats refused to leave New Orleans – or abandon his sick wife – during Hurricane Katrina. His home was badly flooded and he lost most of his possessions; he was rescued by the Coast Guard on September 1st. Following the disaster, Fats released Alive and Kickin’ and donated a portion of the proceeds to the Tipitina’s Foundation, which helped New Orleans’ struggling musicians.

Following the album’s release, Fats retreated back into private life and largely shunned publicity. In 2008, Rosemary, his wife of more than fifty years, died after a long illness. Fats joined her on October 24th, 2017, at the age of eighty-nine.

Fats Domino must be credited as a key pioneer of rock ’n’ roll. Together with Jerry Lee Lewis (1935 – ) and Little Richard (1932 – ), Fats’ style of piano playing helped define the new genre and inspired dozens of future musicians. No wonder The Rolling Stone Record Guide likened him to Benjamin Franklin (1706 – 1790).