Liberal Semantics

The word “liberal” originated in Latin, then made its way into French and from there into English. The Oxford English Dictionary gives this as its primary definition:
“Willing to respect or accept behaviour or opinions different from one’s own; open to new ideas.”
However, it also has a political usage as in “the liberal senator from Massachusetts.” This meaning and usage must be relatively new: for one thing, we know that “liberal” was not given a political connotation by Dr. Samuel Johnson in his celebrated dictionary of 1755:
    Liberal, adj. [liberalis, Latin, libéral, French]
1. Not mean; not low in birth; not low in mind.
2. Becoming a gentleman.
3. Munificent; generous; bountiful; not parcimonious.
So when did the good word take on that political connotation? Mystère.
We owe the attribution of a political meaning to the word to the Scottish Enlightenment and two of its leading lights, the historian William Robertson and the political economist Adam Smith. Robertson and Smith were friends and correspondents as well as leading figures of the Scottish universities; they used “liberal” to refer to a society with safeguards for private property and an economy based on market capitalism and free trade. Robertson is given priority today for using it this way in his 1769 book The History of the Reign of the Emperor Charles V. On the other hand, many in the US follow the lead of conservative icon Friedrich Hayek, who credited Smith on the grounds that the term appears in The Wealth of Nations (1776); Hayek wrote The Road to Serfdom (1944), a seminal work arguing that economic freedom is a prerequisite for individual liberty.
Today, the related term “classical liberalism” is applied to the philosophy of John Locke (1632-1704) and he is often referred to as the “father of liberalism.” His defense of individual liberty, his opposition to absolute monarchy, his insistence on separation of church and state, and his analysis of the role of “the social contract” provided the U.S. founding fathers with philosophical tools crucial for the Declaration of Independence, the Articles of Confederation and ultimately the Constitution. It is this classical liberalism that also inspired Simon Bolivar, Bernardo O’Higgins and other liberators of Latin America.
In the early 19th century, the Whig and Tory parties were dominant in the British Parliament. Something revolutionary happened when the Whigs engineered the passage of the Reform Act of 1832, an important step toward making the U.K. a democracy in the modern sense of the term. According to historians, this began the peaceful transfer of power from the landed aristocracy to the emergent bourgeois class of merchants and industrialists. It also coincided with the end of the Romantic Movement, the era of the magical poetry of Keats and Shelley, and led into the Victorian Period and the well-intentioned poetry of Arnold and Tennyson.
Since no good deed goes unpunished (especially in politics), passage of the Reform Act of 1832 also led to the demise of the Whig Party: the admission of the propertied middle class into the electorate and into the House of Commons itself split the Whigs and the new Liberal Party emerged. The Liberal Party was a powerful force in English political life into the 20th century. Throughout, the party’s hallmark was its stance on individual liberties, free-markets and free-trade.
Accordingly, in the latter part of the 19th century in Europe and the US, the term “liberalism” came to mean commitment to individual freedoms (in the spirit of Locke) together with support of free-market capitalism mixed in with social Darwinism. Small government became a goal: “That government is best which governs least,” to steal a line from Henry David Thoreau.
Resistance to laissez-faire capitalism developed and led to movements like socialism and labor unions. In the US social inequality also fueled populist movements such as that led by William Jennings Bryan, the champion of Free Silver and other causes. Bryan, a brilliant orator, was celebrated for his “Cross of Gold” speech, an attack on the gold standard, in which he intoned
    “you shall not crucify mankind upon a cross of gold.”
He was a national figure for many years and ran for President on the Democratic ticket three times; he earned multiple nicknames such as The Fundamentalist Pope, the Boy Orator of the Platte, The Silver Knight of the West and the Great Commoner.
At the turn of the century, public intellectuals in the US like John Dewey began to criticize the basis of laissez-faire liberalism as too individualistic and too threatening to an egalitarian society. President Theodore Roosevelt joined the fray, led the “progressive” movement, initiated “trust-busting” and began regulatory constraints to rein in big business. The Sixteenth Amendment, which authorized a federal income tax, was proposed by Congress in 1909 and ratified by the state legislatures in 1913.
At this time, the word “liberal” took on its modern political meaning: “liberal” and “liberalism” came to refer to the non-socialist, non-communist political left – a position that both defends market capitalism and supports infrastructure investment and social programs that benefit large swaths of the population. In Europe the corresponding phenomenon is Social Democracy, though the Social Democrats tend to be more to the left and stronger supporters of the social safety net, not far from the people who call themselves “democratic socialists” in the US today.
On the other hand, the 19th century meaning of “liberalism” has been taken on by the term “neo-liberalism” which is used to designate aggressive free-market capitalism in the age of globalization.
In the first term of Woodrow Wilson’s presidency, Congress passed the Clayton Anti-Trust Act as well as legislation establishing the Federal Reserve System and the progressive income tax. Wilson is thus credited with being the founder of the modern Democratic Party’s liberalism – this despite his anti-immigrant stance, his anti-Catholic stance and his notoriously racist anti-African-American stance.
The great political achievement of the era was the 19th Amendment which established the right of women to vote. The movement had to overcome entrenched resistance, finally securing the support of Woodrow Wilson and getting the necessary votes in Congress in 1919. Perhaps it is this that has earned Wilson his standing in the ranks of Democratic Party liberals.
Bryan, for his part a strong supporter of Wilson and his liberal agenda in the 1912 election, then served as Wilson’s first Secretary of State, resigning over the handling of the Lusitania sinking. His reputation has suffered over the years because of his humiliating battle with Clarence Darrow in the Scopes “Monkey” Trial of 1925 (Fredric March and Spencer Tracy resp. in “Inherit the Wind”); at the trial, religious fundamentalist Bryan argued against teaching human evolution in public schools. It is likely this has kept him off the list of heroes of liberal politics in the US, especially given that this motion picture, a Stanley Kramer “message film,” was an allegory about the McCarthy era witch-hunts. Speaking of allegories, a good case can be made that the Wizard of Oz is an allegory about the populist movement and the Cowardly Lion represents Bryan himself – note, for one thing, that in L. Frank Baum’s book, Dorothy wears Silver Shoes and not Ruby Slippers!
The truly great American liberal was FDR whose mission it was to save capitalism from itself by enacting social programs called for by socialist and labor groups and by setting up regulations and guard rails for business and markets. The New Deal programs provided jobs and funded projects that seeded future economic growth; the regulations forced capitalism to deal with its problem of cyclical crises, panics and depressions. He called for a “bank holiday,” kept the country more or less on the gold standard by issuing an executive order to buy up nearly all the privately held gold in the country (hard to believe today), began Social Security and unemployment insurance, instituted centralized controls for industry, launched major public works projects (from the Lincoln Tunnel to the Grand Coulee Dam), brought electricity to farms, archived the nation’s folk music and folklore, sponsored projects which brought live theater to millions (launching the careers of Arthur Miller, Orson Welles, Elia Kazan and many others) and more. This was certainly not a time of government shutdowns.
In the post WWII period and into the 1960s, there were even “liberal Republicans” such as Jacob Javits and Nelson Rockefeller; today “liberal Republican” is an oxymoron. The most daring of the liberal Republicans was Earl Warren, the one-time Governor of California who in 1953 became Chief Justice of the Supreme Court. In that role, Warren created the modern activist court, stepping in to achieve justice for minorities, an imperative which the President and the Congress were too cowardly to take on. But his legacy of judicial activism has led to a politicized Supreme Court with liberals on the losing side in today’s run of 5-4 decisions.
Modern day liberalism in the U.S. is also exemplified by LBJ’s Great Society which instituted Medicare and Medicaid and which turned goals of the Civil Rights Movement into law with the Civil Rights Act of 1964 and the Voting Rights Act of 1965.
JFK and LBJ were slow to rally to the cause of the Civil Rights Movement but in the end they did. Richard Nixon and the Republicans then exploited anti-African-American resentment in the once Democratic “solid South” and implemented their “Southern strategy” which, as LBJ feared, has kept those states solidly Republican ever since. The liberals’ political clout was also gravely wounded by the ebbing of the power of once mighty labor unions across the heartland of the country. Further, the conservative movement was energized by the involvement of ideologues with deep pockets like the Koch brothers and by the emergence of charismatic candidates like Ronald Reagan. The end result has been that only the West Coast and the Northeast – places like San Francisco and Brooklyn – can be counted on to elect liberal candidates consistently.
What is more, liberal politicians have lost their sense of mission and have failed America in many ways since that time as they have moved further and further to the right in the wake of electoral defeats, cozying up to Wall Street along the way. For example, it was Bill Clinton who signed the bill repealing the Glass-Steagall Act undoing one of the cornerstones of the New Deal; he signed the bill annulling the Aid to Families With Dependent Children Act which also went back to the New Deal; he signed the bill that has made the US the incarceration capital of the world, the Violent Crime Control and Law Enforcement Act.
Over the years, the venerable term “liberal” itself has been subjected to constant abuse from detractors. The list of mocking gibes includes tax-and-spend liberal, bleeding heart liberal, hopey-changey liberal, limousine liberal, Chardonnay sipping liberal, Massachusetts liberal, Hollywood liberal, … . There is even a book of such insults and a web site for coming up with new ones. And there was the humiliation of the defeat of liberal standard bearer Hillary Clinton in 2016.
So battered is it today that “liberal” is giving way to “progressive,” the label of choice for so many of the men and women of the class of 2018 of the House of Representatives. Perhaps one hundred years is the limit of the shelf life of a major American political label, which would mean “liberal” has reached the end of the line – time to give it a rest and go back to Samuel Johnson’s definition?

Conservative Semantics

Conservatism as a political philosophy traces its roots to the late 18th century: its intellectual leaders were the Anglo-Irish Member of Parliament Edmund Burke and the Scottish economist and philosopher Adam Smith.
In his speeches and writings, Burke extolled tradition, the “natural law” and “natural rights”; he championed social hierarchy, an established church, gradual social change and free markets; he excoriated the French Revolution in his influential pamphlet Reflections on the Revolution in France, a defense of monarchy and the institutions that protect good social order.
Burke is also well known in the U.S. for his support for the colonists in the period before the American Revolution notably in his Speech on Conciliation with the Colonies (1775) where he alerts Parliament to the “fierce spirit of liberty” that characterizes Americans.
Adam Smith, a giant figure of the Scottish Enlightenment, was the first great intellectual champion of laissez-faire capitalism and author of the classic The Wealth of Nations (1776).
Burke and Smith formed a mutual admiration society. According to a biographer of Burke, Smith thought that “on subjects of political economy, [Burke] was the only man who, without communication, thought on these subjects exactly as he did”; Burke, for his part, called Smith’s opus “perhaps the most important book ever written.” Their view of things became the standard one for conservatives throughout the 19th century and well into the 20th.
However, there is an internal inconsistency in traditional conservatism. The problem is that, in the end, laissez-faire capitalism upends the very social structures that traditional conservatism seeks to maintain. The catch-phrase of the day among pundits has become “creative destruction”; this formula, coined by the Austrian-American economist Joseph Schumpeter, captures the churning of capitalism which systematically creates new industries and new social institutions that replace the old – e.g. Sears by Amazon, an America of farmers by an America of city dwellers. Marx argued that capitalism’s failures would lead to its demise; Schumpeter argued that capitalism has more to fear from its triumphs: ineluctably the colossal success of capitalism hollows out the social institutions and mores which nurture capitalism such as church-going and the Protestant Ethic itself. Look at Western Europe today where capitalism is triumphant but where church attendance is reduced to three events: “hatch, match and dispatch,” to put it the playful way Anglicans do.
The Midas touch is still very much with us: U.S. capitalism tends to transform every activity it comes upon into a money-making version of itself. Thus something once innocent and playful like college athletics has been turned into a lucrative monopoly: the NCAA rules over a network of plantations staffed by indentured workers and signs billion dollar television contracts. Health care, too, has been transformed into a money-making machine with lamentable results: Americans pay twice as much for doctors’ care and prescription drugs as those in other advanced industrialized countries and the outcomes are grim in comparison: infant mortality and death in childbirth are off the charts in the U.S. and life expectancy is low compared to those other countries.
On the other hand, a modern capitalist economy can work well for its citizens. We have the examples of Scandinavia and of countries like Japan and Germany. Economists like Thomas Piketty write about the “thirty glorious” years after 1945 when post WWII capitalism built up a solid, prosperous middle class in Western Europe. Add to this what is known as the “French paradox” – the French drink more than Americans, smoke more, have sex more and still live some years longer. To make things worse, their cuisine is better, their work week is shorter and they take much longer vacations – one more example of how a nation can make capitalism work in the interest of its citizenry.
In American political life, in the 1930s, the label “conservative” was grabbed by forces opposed to FDR and the New Deal. Led by Senator Josiah W. Bailey of North Carolina, conservative Democrats, with some Republican support, published “The Conservative Manifesto,” a document which extolled the virtues of free enterprise, limited government and the balance of power among the branches of government.
In the post-war period the standard bearer of conservatism in the U.S. was Republican Senator Robert Taft of Ohio who was anti-New-Deal, anti-union, pro-business and who, as a “fiscal conservative,” stood for reduced government spending and low taxes; he also stood for a non-interventionist foreign policy. His conservatism harked back to Burke’s ideals of community: he supported Social Security, a minimum wage, public housing and federal aid to public education.
However, the philosophy of the current “conservative” political leadership in the U.S. supports all the destructive social Darwinism of laissez-faire capitalism, reflecting the 17th century English philosopher Thomas Hobbes and his dystopian vision much more than either Burke or Smith. Contemporary “conservatism” in the U.S. is hardly traditional conservatism. What happened? Mystère.
A more formal manifesto of Burkean conservatism, The Conservative Mind, was published in 1953 by Russell Kirk, then a professor at Michigan State. But conservative thought was soon co-opted and transformed by a wealthy young Texan whose family money came from oil prospecting – in Mexico and Venezuela! William F. Buckley, like Kirk a Roman Catholic, was founder and longtime editor-in-chief of the seminal conservative weekly The National Review. Buckley is credited with (or accused of) transforming traditional Burkean conservatism into what goes by the name of “conservatism” in the U.S. today; he replaced the traditional emphasis on community with his libertarian viewpoint of “individualism” and replaced Taft’s non-interventionism with an aggressive Cold War political philosophy – the struggle against godless communism became the great moral cause of the “conservative movement.”
To his credit, Buckley kept his distance from fringe groups such as the John Birch Society; Buckley also eschewed Ayn Rand and her hyper-individualistic, atheistic philosophy of Objectivism; a man of letters himself, Buckley was likely appalled by her wooden prose – admittedly Russian and not English was her first language, but still she was no Vladimir Nabokov. On the other hand, Buckley had a long friendship with Norman Mailer, the literary icon from Brooklyn, the opposite of Buckley in almost every way.
Buckley as a Cold War warrior was very different from libertarians Ron Paul and Rand Paul who both have an isolationist philosophy that opposes military intervention. On the other hand, all three have expressed similarly eccentric views on race, presumably justified by their shared libertarian concept of the right of individuals to do whatever they choose to do even if it includes discrimination against others. For example, Rand Paul has stated that he would have voted against the Civil Rights Act of 1964 which outlawed the Jim Crow Laws of the segregationist states “because of the property rights element … .”
In the 1960s, 70s and 80s, Buckley’s influence spread. The future president Ronald Reagan was weaned off New Deal Liberalism through reading The National Review; in turn Buckley became a supporter of Reagan and they appeared together on Buckley’s TV program Firing Line. The “conservative movement” was also propelled by ideologues with deep pockets and long term vision like the Koch brothers – for an interesting history of all this, see Nancy MacLean’s Democracy in Chains.
To the Buckley conservatives today, destruction of social institutions, “creative” or otherwise, is somehow not a problem and militarism is somehow virtuous.
As for destruction, among the social structures that have fallen victim recently to creative destruction is the American middle class itself, as income inequality has grown apace. This process began at the tail end of the 1960s and has been accelerating since Ronald Reagan’s presidency as Keynesian economics has given way to “supply-side” economics; moreover, the guardrails for capitalism imposed by the New Deal have come undone: the Glass-Steagall Act has been repealed, the labor movement has been marginalized, and high taxes on the wealthy have become a thing of the past – contrast this with the fact that Colonel Tom Parker, the manager of Elvis Presley, considered it his patriotic duty to keep The King in the 90% tax bracket back in the day.
As for militarism, despite VE Day and VJ Day, since the 1950s, the U.S. has been engaged in an endless sequence of wars – big (Korea) and small (Grenada), long (Vietnam) and short (the First Gulf War), visible (Afghanistan) and invisible (Niger), loud (Iraq) and quiet (Somalia), … . All of which has created a situation much like the permanent state of war of Orwell’s 1984.
Moreover, since Buckley’s time, American “conservatives” have moved even further right: reading Ayn Rand (firmly atheist and pro-choice though she was) in high school or college has become a rite of passage, e.g. for ex-Speaker Paul Ryan. An interventionist, even war-mongering, wing of the “conservative movement” has emerged, the “neo-conservatives” or “neo-cons.” Led by Dick Cheney, they were the champions of George W. Bush’s invasion of Iraq and they applaud all troop “surges” and new military interventions.
As David Brooks recently pointed out in his New York Times column (Nov. 16, 2018), the end of the Cold War deprived the “conservative movement” of its great moral cause, the struggle against godless communist collectivism. And what was a cause has morphed into expensive military adventurism. Indeed, the end of the Cold War failed to yield a “peace dividend” and the military budget today threatens the economic survival of the nation – the histories of France, Spain and many other countries bear witness to how this works itself out, alas! In days of yore, it would have been the fiscal restraint of people known as conservatives that kept government spending in check; today “conservative” members of Congress continue to sound like Robert Taft on the subject of government spending when attacking programs sponsored by their opponents, but they do not hesitate to drive the national debt higher by over-funding the military and pursuing tax cuts for corporations and the wealthy. Supply-side economics cleaves an ever-widening income gap, the least conservative social policy imaginable. Then, too, these champions of the free market and opponents of government intervention rushed to bail out the big banks (but not the citizens whose homes were foreclosed on) during the Great Recession of 2008. All this leads one to think that this class of politicians is serving its donor class and not the working class, the middle class, the upper middle class or even much of the upper class.
Perhaps semantic rock bottom is reached when “conservative” members of Congress vote vociferously against any measure for environmental conservation. But this is predictable given the lobbying power of the fossil fuel industry, a power so impressive that even the current head of the Environmental Protection Agency is a veteran lobbyist for Big Coal. Actually, for these conservatives, climate change denial is consistent with their core beliefs: fighting the effects of global warming effectively will require large-scale government intervention, significantly increased regulation of industry and agriculture as well as binding international agreements – all of which are anathema to conservatives in the U.S. today.
Still, it is in matters judicial that the word “conservative” is most misapplied. One speaks today of “conservative majorities” on the Supreme Court, but these majorities have proved themselves all too ready to rewrite laws and overturn precedent in 5-4 decisions in an aggressive phase of judicial activism.
So for those who fear that corruption of the language is dangerous for the U.S. population, this is the worst of times: “liberal” which once designated a proponent of Gilded Age laissez-faire capitalism is now claimed by the heirs of the New Deal and the Great Society; “conservative” which once designated a traditionalist is now the label for radical activists both political and judicial. “Liberal” is yielding to “progressive” now. However, the word “conservative” has a certain gravitas to it and “conservatism” has taken on the trappings of a religious movement complete with patron saints like Ronald Reagan and Margaret Thatcher; “conservative” is likely to endure, self-contradictory though it has become.

The Roberts Court

In 2005, upon the death of Chief Justice Rehnquist, John Roberts was named to the position of Chief Justice by Republican president George W. Bush. Another change in Court personnel occurred in 2006 when Sandra Day O’Connor retired and was replaced by Justice Samuel Alito. With Roberts and Alito, the Court had an even more solid “conservative” majority than before – the result being that, more than ever, in a 5-4 decision a justice’s vote could be predicted from the party of the president who appointed him or her.

It was Ronald Reagan who named the first woman justice to the Supreme Court with the appointment of Sandra Day O’Connor in 1981. It was also Ronald Reagan who began the practice of Republican presidents’ naming ideological, conservative Roman Catholics to the Supreme Court with the appointment of Antonin Scalia in 1986. This practice on the part of Republican presidents has indeed been followed faithfully as we have to include Neil Gorsuch in this group of seven – though an Episcopalian today, Gorsuch was raised Catholic, went to parochial school and even attended the now notorious Georgetown Prep. Just think: with Thomas and Gorsuch already seated, the Brett Kavanaugh appointment brings the number of Jesuit-trained justices on the Court up to three; this numerologically magic number of men trained by an organization famous for having its own adjective, plus the absence of true WASPs from the Supreme Court since 2010, plus the fact that all five of the current “conservative” justices have strong ties to the cabalistic Federalist Society, could all make for an interesting conspiracy theory – or at least the elements of a Dan Brown novel.

It is said that Chief Justice Roberts is concerned about his legacy and does not want his Court to go down in history as ideological and “right wing.” However, this “conservative” majority has proven radical in its 5-4 decisions, decisions for which it bears full responsibility.

They have put gun manufacturers before people, replacing the standard interpretation of the 2nd Amendment that went back to Madison’s time with a dangerous new one, cynically appealing to “originalism” and claiming the authority to speak for Madison and his contemporaries (District of Columbia v. Heller 2008).

Indeed, with Heller there was no compelling legal reason to play games with the meaning of the 2nd Amendment – if the over 200 years of interpretation of the wording of the amendment isn’t enough, if the term “militia” isn’t enough, if the term “bear arms” isn’t enough to link the amendment to matters military in the minds of the framers, one can consult James Madison’s original text:

    “The right of the people to keep and bear arms shall not be infringed; a well armed, and well regulated militia being the best security of a free country: but no person religiously scrupulous of bearing arms, shall be compelled to render military service in person.” [Italics added].

The italicized clause was written to reassure Quakers and other pacifist religious groups that the amendment was not forcing them to serve in the military, but it was ultimately excluded from the final version for reasons of separation of church and state. This clause certainly indicates that the entirety of the amendment, in Madison’s view, was for the purpose of maintaining militias: Quakers are not vegetarians and do use firearms for hunting. Note too that Madison implies in this text and in the shorter final text as well that “the right to bear arms” is a collective “right of the people” rather than an individual right to own firearms.

The radical ruling in Heller by the five “conservative” justices has stopped all attempts at gun control, enriched gun manufacturers, elevated the National Rifle Association to the status of a cult and made the Court complicit in the wanton killings of so many.

The “conservative” majority of justices has overturned campaign finance laws passed by Congress and signed by the President, summoning up an astonishing, ontologically challenged version of the legal fiction that corporations are “persons” and imbuing them with new First Amendment rights (Citizens United v. FEC 2010).

Corporations are treated as legal “persons” in some court matters, basically so that they can pay taxes and so that the officers of the corporation are not personally liable for a corporation’s debts. But there was no compelling legal reason to play Frankenstein in Citizens United and create a new race of corporate “persons” by endowing corporations with a human-like right to free speech that allows them to spend their unlimited money on U.S. political campaigns; this decision is the first of the Roberts Court’s rulings to make a list of all-time worst Supreme Court decisions (https://blogs.findlaw.com/supreme_court/2015/10/13-worst-supreme-court-decisions-of-all-time.html) compiled for legal professionals. It has also made TIME magazine’s list of the two worst decisions in the last 60 years and likely many other such rankings. The immediate impact of this decision has been a further widening of the gap between representatives and the people they are supposed to represent; where the political class was once at least somewhat responsive to the voters, it is now responsive only to the donor class. This likely works well for the libertarians and conservatives who boast “this is a Republic, not a Democracy.”

These same five justices have continued their work by

  • usurping Congress’ authority and undoing hard-won minority protections from the Voting Rights Act by adventuring into areas of history and politics that they clearly do not grasp and basing the decision on a disingenuous view of contemporary American race relations (Shelby County v. Holder 2013),
  • doubling down on quashing the Voting Rights Act five years later in a decision that overturned a lower court ruling that Texas’ gerrymandered redistricting map undercut the voting power of black and Hispanic voters (Texas Redistricting Case 2018),
  • breaching the separation of Church and State by ascribing “religious interests” to companies in a libertarian judgment that can justify discrimination in the name of a “person’s” individual freedoms, the “person” in this case being a corporation no less (Burwell v. Hobby Lobby Stores 2014),
  • gravely wounding the labor movement by overturning the Court’s own ruling in a 1977 case, Abood v. Detroit Board of Education, thus undoing years of established Labor Law practice (Janus v. AFSCME 2018) – a move counter to the common-law principle of stare decisis, the following of precedent.

These six decisions are examples of “self-inflicted wounds” on the part of the Roberts Court and can be added to the list begun by Chief Justice Charles Evans Hughes, a list that starts with Dred Scott. The recent accessions of Neil Gorsuch and Brett Kavanaugh to seats on the Court may well make for even more decisions of this kind.

This judicial activism is indeed as far away from the dictionary meaning of “conservatism” as one can get. Calling these activist judges “conservative” makes American English a form of the “Newspeak” of Orwell’s 1984. The Court can seem to revel in its arrogance and its usurpation of power: Justice Scalia would dismiss questions about Bush v. Gore with “Get over it” – a rejoinder some liken to “Let them eat cake” – and he refused to recuse himself in cases involving Dick Cheney, his longtime friend (Bush v. Gore, Cheney v. U.S. District Court).

The simple fact is that the courts have become too politicized. The recent fracas between the President and the Chief Justice where the President claimed that justices’ opinions depended on who appointed them just makes this all the more apparent.

Pundits today talk endlessly on the topic of how “we are headed for a constitutional crisis” in connection with potential proceedings to impeach the President. But we are indeed in a permanent constitutional crisis in any case. For example, there is a clear majority in the country that wants to undo the results of Citizens United and of Heller in particular – both are decisions which shot down laws enacted by elected representatives. Congressional term limits are another example; in 1995, with U.S. Term Limits, Inc. v. Thornton, the Court nullified 23 state laws instituting term limits for members of Congress, thereby declaring that the Constitution had to be amended for such laws to pass judicial review.

In the U.S. the Congress is helpless when confronted with this kind of dilemma; passing laws cannot help since the Court has already had the final say, a true Catch-22. This is an American constitutional problem, American Exceptionalism gone awry. In England and Holland, for example, the courts cannot apply judicial review to nullify a law; in France, the Conseil Constitutionnel has very limited power to declare a law unconstitutional; this was deliberately engineered by Charles de Gaulle to avoid an American style situation because, per de Gaulle, “la [seule] cour suprême, c’est le peuple” (the only supreme court is the people).

So what can the citizen majority do? The only conceivable recourse is to amend the Constitution; but the Constitution itself makes that prospect dim since an amendment would require approval by a supermajority in both houses of Congress followed by ratification by three-fourths of the states; moreover, the thirteen least populous states – home to barely 4% of the population – are more than enough to block ratification, making it easy and relatively inexpensive for opposition to take hold – e.g. the Equal Rights Amendment. Also, those small, rural states’ interests can be very different from the interests of the large states – one reason reform of the Electoral College system is an impossibility with the Constitution as it is today. Only purely bureaucratic measures can survive the amendment ratification process. Technically, there is a second procedure in Article V of the Constitution where a proposal to call a Constitutional Convention can be initiated by two-thirds of the states, but then approval of an amendment by three-fourths of the states is still required; this procedure has never been used. Doggedly, the feisty group U.S. Term Limits (USTL), the losing side in that 1995 decision, is trying to do just that! For their website, see https://www.termlimits.com/ .

What has happened is that the Constitution has been gamed by the executive and judicial branches of government and the end result is that the legislative branch is mostly reduced to theatrics. Thus, for example, while the Congress is supposed to have the power to make war and the power to levy tariffs, these powers have been delegated to the office of the President. Even the power to make law has, for so many purposes, been passed to the courts where every law is put through a legal maze and the courts are free to nullify the law or change its meaning, invoking the interpretation du jour of the Constitution and/or overturning legal precedents, all on an as-needed basis.

This surge in power of the judiciary was declared to be impossible by Alexander Hamilton in his Federalist 78 where he argues approvingly that the judiciary will necessarily be the weakest branch of the government under the Constitution. But oddly no one is paying attention to the rock-star founding father this time. For example, this Federalist Paper of Hamilton’s is sacred scripture for the assertive Federalist Society, yet they seem silent on this issue – not surprising given that they have become the gatekeepers for Republican presidents’ nominees for the Supreme Court.

Americans are simply blinded to problems with the Constitution by the endless hymns in its praise in the name of American Exceptionalism. Many in Europe also argue that the way the Constitution contributes to inaction is a principal reason that voter participation in the U.S. is far lower than that in Europe and elsewhere. Add to that the American citizen’s well-founded impression that it is the money of corporations, billionaires and super-PACs in cahoots with the lobbyists of the Military Industrial Complex, Big Agriculture, Big Pharma, Big Oil and Big Banks that runs the show and you have a surefire formula to induce voter indifference. Even the improved turnout in the 2018 midterm elections was unimpressive by international standards.

This is not a good situation; history has examples of what happens when political institutions are no longer capable of running a complex nation – the French Revolution, the fall of the Roman Republic, the rise of Fascism in post WWI Europe … .

Bush v. Gore

In 1986, when Warren Burger retired, Ronald Reagan promoted Associate Justice William Rehnquist to the position of Chief Justice and nominated Antonin Scalia to fill Rehnquist’s seat. With the later arrivals of Anthony Kennedy (1988) and Clarence Thomas (1991), this produced a solid conservative kernel on the Court consisting of the five justices Rehnquist, Scalia, Thomas, O’Connor and Kennedy; there was also Justice John Paul Stevens (appointed by Gerald Ford) who was considered a “moderate conservative.” On occasion O’Connor or Kennedy could become a swing vote and turn things in another direction, and Stevens too voted against the conservative majority on some important decisions.
While more conservative than the Burger Court, the Rehnquist Court did not overthrow the legacy of the Warren Court; on the other hand, it promoted a policy of “New Federalism” which favored empowering the states rather than the federal government.
This philosophy was applied in two cases that weakened Roe v. Wade, the defining ruling of the Burger Court.
Thus in Webster v. Reproductive Health Services (1989), the Court upheld a Missouri law that restricted the way state funds could be used in connection with counseling and other aspects of abortion services; this ruling allowed states to legislate in ways thought to have been ruled out by Roe.
As a second example, we have their ruling in Planned Parenthood v. Casey (1992) which also weakened Roe by giving much more power to the states to control access to abortion. Thus today in states like Mississippi, there is virtually no such access. All this works against the poor and the less affluent as women need to travel far, even out of state, to get the medical attention they seek.
Then the Rehnquist Court delivered one of the most controversial, politicized decisions imaginable with its ruling in Bush v. Gore (2000). With this decision, the Court came between a state supreme court and the state’s election system and hand-delivered the presidency to Republican George W. Bush.
After this case, the Court made other decisions that generated some controversy, but in these it came down, relatively speaking, on the liberal side in ruling on anti-sodomy laws, on affirmative action and on election finance. However, Bush v. Gore is considered one of the worst Supreme Court decisions of all time: it appears on lists of the worst decisions alongside Dred Scott and Plessy v. Ferguson, and a TIME magazine piece singles it out as one of the two worst decisions since 1960 (along with Citizens United v. FEC).
Naturally, the 5-4 decision in Bush v. Gore by the Court’s conservative kernel is controversial because of the dramatic end it put to the 2000 presidential election. There are also legal and procedural aspects of the case that get people’s dander up.
To start, there is the fact that in this decision the Court overruled a state supreme court on the matter of elections, something that the Constitution itself says should be left to the states.
For elections, Section 4 of Article 1 of the U.S. Constitution leaves the implementation to the states to carry out, in the manner they deem fit – subject to Congressional oversight but not to court oversight:
    “The Times, Places and Manner of holding Elections for Senators and Representatives, shall be prescribed in each State by the Legislature thereof; but the Congress may at any time by Law make or alter such Regulations, except as to the Places of chusing (sic) Senators.”
N.B. In the Constitution, the “Senators” are an exception because at that time the senators were chosen by the state legislatures and direct election of senators by popular vote did not come about until 1913 and the 17th Amendment.
From the time of the Constitution, voting practices have varied from state to state. In fact, at the outset free African-Americans with property could vote in Maryland and that lasted until 1810; women of property could vote in New Jersey until 1807; in both cases, the state legislatures eventually stepped in and “restored order.”
In the Constitution, it is set out that the electors of the Electoral College are to be appointed in whatever manner each state legislature directs – there was no requirement that they be voted for by the people at all; Hamilton and Madison were most fearful of “mob rule.” The only founding father who expressed some admiration for the mass of U.S. citizenry was, not surprisingly, Jefferson who famously asserted in a letter to Lafayette that “The yeomanry of the United States are not the canaille [rabble] of Paris.”
Choosing Electors by popular vote was established nationwide, however, by the 1820s; under Article II, Section 1 of the Constitution, it is each state’s responsibility to implement its own system for choosing electors; there is no requirement for things to be uniform. In fact, today Nebraska and Maine use congressional district voting to divide up their electors among the candidates while all the other states use plurality voting where the presidential candidate with the most votes is awarded all the electoral votes from that state. In yet another break with the plurality voting system that the U.S. inherited from England, the state of Maine now employs ranked choice voting to elect Congressional representatives – in fact, in 2018 a candidate in a House race in Maine with fewer first place votes but a larger total of first and second place votes emerged the victor in the second round of the instant runoff; a sketch of how such a runoff tally works follows immediately below.
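The mechanics of such an instant runoff are simple enough to sketch in a few lines of Python. This is a toy illustration only – the ballots and candidate names are invented, not the actual Maine returns: each ballot ranks candidates in order of preference; every round, each ballot counts for its highest-ranked surviving candidate, and the last-place candidate is eliminated until someone holds a strict majority.

    # Toy sketch of an instant-runoff (ranked-choice) tally.
    # Ballots and candidate names are hypothetical, for illustration only.
    from collections import Counter

    def instant_runoff(ballots):
        # Each ballot is a tuple of candidates in order of preference.
        candidates = {c for ballot in ballots for c in ballot}
        while True:
            # Count each ballot for its highest-ranked surviving candidate.
            tally = Counter(
                next(c for c in ballot if c in candidates)
                for ballot in ballots
                if any(c in candidates for c in ballot)
            )
            leader, votes = tally.most_common(1)[0]
            if 2 * votes > sum(tally.values()):  # strict majority reached
                return leader, dict(tally)
            candidates.remove(min(tally, key=tally.get))  # drop last place

    # B trails A on first-place votes (35 vs. 40) but wins once C is
    # eliminated and C's 25 ballots transfer to B, their second choice.
    ballots = [("A",)] * 40 + [("B", "A")] * 35 + [("C", "B")] * 25
    print(instant_runoff(ballots))  # -> ('B', {'A': 40, 'B': 60})

This is the pattern of the 2018 Maine race: the eventual winner trailed on first-place votes but prevailed once the second-round transfers were counted.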
So from a Constitutional point of view, the Supreme Court really did not have the authority to take the case Bush v. Gore on. In his dissent, Justice John Paul Stevens decried this usurpation of state court power:
    [The court displayed] “an unstated lack of confidence in the impartiality and capacity of the state judges who would make the critical decisions if the vote count were to proceed”.
Moreover, Sandra Day O’Connor said as much when some years later in 2013 she expressed regret over her role in Bush v. Gore telling the Chicago Tribune editorial board: “Maybe the court should have said, ‘We’re not going to take it, goodbye.’ ”
Taking this case on added to the Court’s history of “self-inflicted wounds,” to use the phrase Chief Justice Charles Evans Hughes applied to bad decisions that the Court had no compelling legal reason to make the way it did.
The concurring justices admitted that their decision was not truly a legal ruling but rather an ad hoc way of making a problem go away when they said that the ruling in Bush v. Gore should not be considered a precedent for future cases:
    “Our consideration is limited to the present circumstances, for the problem of equal protection in election processes generally presents many complexities.”
Another odd thing was that the ruling did not follow the usual practice of having one justice write the deciding opinion with concurring and dissenting opinions from the other justices. Instead, they issued a per curiam (“by the court”) decision, a form usually reserved for brief, uncontroversial rulings or for a 4-4 hung court when no actual decision is being made. It is a technique for dodging responsibility for a decision and not assigning credit for it to any particular justice. As another example of how this method of laying down a decision is employed, the Florida Supreme Court often issues per curiam decisions in death penalty cases.
Borrowing a trope from the Roman orator Cicero, we pass over in silence the revelation that three of the majority justices in this case had reason to recuse themselves – by not mentioning the fact that Justice Thomas’ wife was very active in the Bush transition team even as the case was before the Court, by leaving out the fact that Justice Scalia’s son was employed by the very law firm that argued Bush’s case before the Court, by omitting the fact that Justice Scalia and vice-presidential candidate Dick Cheney were longtime personal friends and by skipping over the fact that, according to The Wall Street Journal and Newsweek, Justice O’Connor had previously said that a Gore victory would be a disaster for her because she would not want to retire under a Democratic president!
So we limit ourselves to quoting Harvard Professor Alan Dershowitz who summed things up this way:
    “The decision in the Florida election case may be ranked as the single most corrupt decision in Supreme Court history, because it is the only one that I know of where the majority justices decided as they did because of the personal identity and political affiliation of the litigants. This was cheating, and a violation of the judicial oath.”
Another villain in the piece is Governor Jeb Bush of Florida whose voter suppression tactics implemented by Secretary of State Katherine Harris disenfranchised a significant number of voters. In the run up to this election, according to the Brennan Center for Justice at NYU, some 4,800 eligible African-American Florida voters were wrongly identified as convicted felons and purged from the voting rolls. Given that 86% of African-American voters went for Gore over Bush in 2000, one can do the math and see that Gore would likely have won if but 20% of these African-American voters had been able to cast ballots.
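To spell out that arithmetic – a back-of-the-envelope calculation assuming the purged voters would have split 86/14 like other African-American Floridians, measured against Bush’s certified statewide margin of 537 votes: 20% of the 4,800 purged voters is 960 ballots; 86% of 960 is roughly 826 votes for Gore against some 134 for Bush, a net gain of about 692 votes for Gore – more than enough to erase the 537-vote margin.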
Yet another villain in the piece and in the recurring election problems in Florida is the plurality voting system that the state uses to assign all its votes for its electors to the candidate who wins the most votes (but not necessarily the majority of votes). This system works poorly in cases where the elections are as tight as again and again they prove to be in Florida. In 2000, had Florida been using ranked-choice voting (to account for votes for Nader and Buchanan) or congressional district voting (as in Maine and Nebraska), there would have been no recount crisis at all – and either way Gore would in all probability have won enough electoral votes to secure the presidency and the matter never would have reached the Supreme Court.
Sadly, the issues of the presidential election in Florida in 2000 are still very much with us – the clumsiness of plurality voting when elections are close, the impact of voter suppression, antiquated equipment, the role of the Secretary of State and the Governor in supervising elections, … . The plot only thickens.

The Warren Court, Part B

In the period from 1953 to 1969, Earl Warren became the most powerful Chief Justice since John Marshall as he led the Court through a dazzling series of rulings that established the judiciary as a more than equal partner in government – an outcome deemed impossible by Alexander Hamilton in his influential paper Federalist 78 and an outcome deemed undesirable for the separation of powers in government by Montesquieu in his seminal The Spirit of the Laws. The impact of this Court was so dramatic that it provoked a nationwide call among conservatives for Warren’s impeachment.
As the Cold War intensified, the historical American separation of Church and State was compromised. In the name of combating godless communism, the national motto was changed! From the earliest days of the Republic, the de facto motto had been “E Pluribus Unum,” Latin for “Out of Many, One”; this motto was adopted by an Act of Congress under the Articles of Confederation in 1782. In 1956, the official motto became “In God We Trust” and that text now appears on all U.S. paper currency. Shouldn’t they at least have asked what the deist-leaning Washington, Jefferson, Franklin and Hamilton would have thought before doing this – after all, their pictures are on the bills?
Spurred on by the fact that the phrase “under God” appears in most versions of the Gettysburg Address,
          that this nation, under God, shall have a new birth of freedom
groups affiliated with organized religion like the Knights of Columbus successfully campaigned for Congress to insert this phrase into the Pledge of Allegiance; this was done in 1954. Interestingly, “under God” does not appear in Lincoln’s written text for the cemetery dedication speech but was recorded by listeners who were taking notes. Again, this insertion in the Pledge was justified at the time by the need to rally the troops in the struggle against atheistic communism. In particular, in the Catholic Church a link was established between the apparitions of The Virgin at Fatima in the period from May to October of 1917 and the October Revolution in Russia in 1917 – though the Revolution actually took place on November 6th and 7th in the Gregorian Calendar; with the Cold War raging, the message of Fatima became a call to say the rosary for the conversion of Russia, a directive that was followed fervently by the laity, especially school children, during the 1950s and 1960s. In addition, the “Second Secret” of Our Lady of Fatima was revealed to contain the line “If my requests are heeded, Russia will be converted, and there will be peace.” When the Soviet Union fell at the end of 1991, credit was ascribed to Ronald Reagan and other political figures; American Catholics of a certain age felt slighted indeed when their contributing effort went unrecognized by the general public!
The course was reversed somewhat with the Warren Court’s ruling in Engel v. Vitale (1962) when the Court declared that organized school prayer violated the separation of Church and State. A second (and better known) decision followed in Abington School District v. Schempp (1963) where the Court ruled that official school Bible reading also violated the separation of Church and State. This latter case is better known in part because it involved the controversial atheist Madalyn Murray O’Hair who went on to make an unsuccessful court challenge to remove “In God We Trust” from U.S. paper currency. Ironically, the federal courts that thwarted this effort cited Abington School District where Justice Brennan’s concurring opinion explicitly stated that “the motto” was simply too woven into the fabric of American life to “present that type of involvement which the First Amendment prohibits.” In the U.S., “God” written with a capital “G” refers specifically to the Christian deity; so a critic deconstructing Brennan’s logic might argue that Brennan concedes that worship of this deity is already an established religion here.
The Warren Court also had a significant impact on other areas of rights and liberties.
With Baker v. Carr (1962) and Reynolds v. Sims (1964), the Court codified the principle of “one man, one vote.” In the Baker case, the key issue was whether state legislative redistricting was a matter for state and federal legislatures or whether it came under the authority of the courts. Here, the Court overturned its own decision in Colegrove v. Green (1946) where it ruled that such redistricting was a matter for the legislatures themselves with Justice Frankfurter declaring “Courts ought not to enter this political thicket.” The majority opinion in the Baker ruling was written by Justice Brennan; Frankfurter naturally dissented. In any case, this was a bold usurpation of authority on the part of the Supreme Court, something hard to undo even should Congress wish to do so. Again we are very far from Marbury v. Madison; were that case to come up today, one would be very surprised if the Supreme Court didn’t instruct Secretary of State Madison to install Marbury as Justice of the Peace in Washington D.C.
With Gideon v. Wainwright (1963) the Court established the accused’s right to a lawyer in state legal proceedings. This right is established for defendants vis-à-vis the federal government by the Bill of Rights with the Sixth Amendment; this case, invoking the Fourteenth Amendment, extended that protection to defendants in dealings with the individual states.
With Miranda v. Arizona (1966), it mandated protection against self-incrimination – the “Miranda rights” that a suspect must be informed of. A Virginia state law banning interracial marriage was struck down as unconstitutional in Loving v. Virginia (1967), a major civil rights case on its own.
The Gideon and Miranda rulings were controversial, especially Miranda, but they do serve to protect the individual citizen from the awesome power of the State, very much in the spirit of the Bill of Rights and of the Magna Carta; behind the Loving case is an inspiring love story and, indeed, it is the subject of a recent motion picture.
Warren’s legacy is complex. On the one hand, his Court courageously addressed pressing issues of civil rights and civil liberties, issues that the legislative and executive branches would not deal with. But by going where Congress feared to tread, the delicate balance of the separation of powers among the three branches of government has been altered, irreparably it appears.
The Warren Court (1953-1969) was followed by the Burger Court (1969-1986).
Without Earl Warren, the Court quickly reverted to making decisions that went against minorities. In San Antonio Independent School District v. Rodriguez (1973), the Court held in a 5-4 decision that inequities in school funding did not violate the Constitution; the ruling implied that discrimination against the poor is perfectly compatible with the U.S. Constitution and that the right to an education is not a fundamental right. This decision was based on the fact that the right to an education does not appear in the Constitution, echoing the logic of Marbury. Later some plaintiffs managed to side-step this ruling by appealing directly to state constitutions. We might add that all five concurring justices in this 5-4 ruling were appointed by Republican presidents – a pattern that is all too common today; judicial activism fueled by political ideology is a dangerous force.
The following year, in another 5-4 decision, Milliken v. Bradley (1974), the Court further weakened Brown v. Board of Education by overturning a circuit court’s ruling. With this ruling, the Court scratched a plan for school desegregation in the Detroit metropolitan area that involved separate school districts, thus preventing the integration of students from Detroit itself with those of adjacent suburbs like Grosse Pointe. The progressive stalwarts Marshall, Douglas and Brennan were joined by Byron White in their dissent; the five concurring justices were all appointed by Republican presidents. The decision cemented into place the pattern of city schools with black students and surrounding suburban schools with white students.
The most controversial decision made by the Burger Court was Roe v. Wade (1973). This ruling invoked the Due Process Clause of the 14th Amendment and established a woman’s right to privacy as a fundamental right and declared that abortion could not be subject to state regulation until the third trimester of pregnancy. Critics, including Ruth Bader Ginsburg, have found fault with the substance of the decision and its being “about a doctor’s freedom to practice his profession as he thinks best…. It wasn’t woman-centered. It was physician-centered.” A fresh attempt to overturn Roe and subsequent refinements such as Planned Parenthood v. Casey (1992) is expected, given the current ideological makeup of the conservative majority on the Court and the current Court’s propensity to overturn even recent rulings.
Today to the overweening power of the Court has been added a political dimension in that in 5-4 decisions there continues to be, with rare exceptions, that direct correlation between a justice’s vote and the party of the president who appointed that justice. To that add the blatantly partisan political shenanigans we have seen on the part of the Senate Majority leader in dealing with Supreme Court nominations and add the litmus test provided by the conservative/libertarian Federalist Society. The plot thickens. Affaire à suivre.

The Warren Court, Part A

Turning to the courts when the other branches of government would not act was the technique James Otis and the colonists resorted to in the period before the American Revolution, the period when the Parliament and the Crown would not address “taxation without representation.” Like the colonists, African-Americans had to deal with a government that did not represent them. Turning to the courts to achieve racial justice and to bring about social change was then the strategy developed by the NAACP. However, for a long time, even victories in the federal courts were stymied by state level opposition. For example, Guinn v. United States (1915) put an end to one “literacy test” technique for voter suppression but substitute methods were quickly developed.
In the 1950s, the Supreme Court finally undid the post-Civil War cases in which the Court had authorized state-level suppression of the civil rights of African-Americans – e.g. the Slaughterhouse Cases (1873), the Civil Rights Cases (1883), Plessy v. Ferguson (1896); these were the Court decisions that callously rolled back the 13th, 14th and 15th amendments to the Constitution and locked African-Americans into an appalling system.
The first chink in Plessy was made in a case brilliantly argued before the Supreme Court by Thurgood Marshall, Sweatt v. Painter (1950). The educational institution in this case was the University of Texas Law School at Austin, which at that time actually maintained a purportedly equal but certainly separate school for African-American law students. The Court was led by Kentuckian Fred M. Vinson, the last Chief Justice to be appointed by a Democratic president – in this case Harry Truman! Marshall exposed the law school charade for the scam that it was. Similarly and almost simultaneously, in McLaurin v. Oklahoma State Regents for Higher Education (1950), the Court ruled that the University of Oklahoma could not enforce segregation in classrooms for PhD students. In these cases, the decisions invoked the Equal Protection Clause of the 14th Amendment; both rulings were unanimous.
These two important victories for civil rights clearly meant that by 1950 things were starting to change, however slowly – was it World War II and the subsequent integration of the military? Was it Jackie Robinson, the Brooklyn Dodgers and the integration of baseball? Was it the persistence of African-Americans as they fought for what was right? Was it the Cold War fear that U.S. racial segregation was a propaganda win for the international Communist movement? Was it the fear that the American Communist party had gained too much influence in the African-American community – after all, Langston Hughes, Paul Robeson and other leaders had visited the Soviet Union, and the leading U.S. scholar of African-American history was Herbert Aptheker, a card-carrying member of the Communist Party? Or was it an enhanced sense of simple justice on the part of nine “old white men”?
The former Republican Governor of California, Earl Warren, was named to succeed Vinson in 1953 by President Dwight D. Eisenhower. The Warren Court would overturn Plessy and other post-Civil War decisions that violated the civil rights of African-Americans and would go on to use the power of the Court in other areas of political and civil liberties. This was a period of true judicial activism. Experienced in government, Warren saw that the Court would have to step in to achieve important democratic goals that the Congress was unwilling to act on. Several strong, eminent jurists were part of this Court. There were the heralded liberals William O. Douglas and Hugo Black. There was Viennese-born Felix Frankfurter, a former Harvard Law professor and a co-founder of the ACLU; Frankfurter was also a proponent of judicial restraint, which strained his relationship with Warren over time as bold judgments were laid down. For legal intricacies, Warren relied on William J. Brennan, another Eisenhower appointee but a friend of labor and a political progressive. Associate Justice John Marshall Harlan, the grandson and namesake of the sole dissenter in Plessy, led the conservative wing.
Perhaps the best-known of the Warren era cases is Brown v. Board of Education (1954), which grouped several civil rights suits that were being pursued by the NAACP and others; the ruling in this case, which like Sweatt was based on the Equal Protection Clause, finally undid Plessy. This case too was argued before the Court by Thurgood Marshall.
Brown was followed by several other civil rights cases which ended legal segregation in other aspects of American life. Moreover, when school integration was not being implemented around the country, the Court, in Brown v. Board of Education II (1955), ordered schools to desegregate “with all deliberate speed”; this elusive phrase proved troublesome. It had been introduced by Court wordsmith Oliver Wendell Holmes Jr. in the 1911 decision Virginia v. West Virginia and was used in Brown II at the behest of Felix Frankfurter, that champion of judicial restraint. The decision was 9-0, as were all the Warren Court’s desegregation decisions, something that Warren considered most important politically.
With Brown II and other cases, the Court ordered states and towns to carry out its orders. This kind of activism is patently inconsistent with the logic behind Marbury v. Madison, where John Marshall declared that the Court could not exercise any power not explicitly given to it in the Constitution, not even one spelled out in an act of Congress. No “foolish consistency” to worry about here.
However, school desegregation hit many obstacles. The resistance was so furious that Prince Edward County in Virginia actually closed its schools down for 5 years to counter the Court’s order; in Northern cities like Boston, enforced busing led to rioting; Chris Rock “jokingly” recounts that in Brooklyn NY he was bused to a neighborhood poorer than the one he lived in – and he was beaten up every day to boot.
The most notorious attempt to forestall the desegregation ruling took place in Little Rock, AR in September 1957. Nine (outstanding) African-American students had been chosen to enroll in previously all-white Central High School. The governor, Orval Faubus, actually deployed National Guard troops to assist segregationists in their effort to prevent these students from attending school. President Dwight D. Eisenhower reacted firmly: the Arkansas National Guard was federalized and taken out of the governor’s control, and the elite 101st Airborne Division of the U.S. Army (the “Screaming Eagles”) was sent to escort the nine students to class, all covered on national television.
Segregationist resistance did not stop there: among other things, the Little Rock schools were closed for the 1958-59 school year in a failed attempt to turn city schools into private schools and this “Lost Year” was blamed on African-American students. It was ugly.
The stirring Civil Rights movement of the 1950s and 1960s fought for racial equality on many fronts. It spawned organizations and leaders like SNCC (Stokely Carmichael), CORE (Roy Innis) and SCLC (Martin Luther King Jr.) and it spawned activists like Rosa Parks, John Lewis, Michael Schwerner, James Chaney and Andrew Goodman. The price was steep; people were beaten and people were murdered.
The President and Congress were forced to react and enacted the Civil Rights Act of 1964 and the Voting Rights Act of 1965. The latter, in particular, had enforcement provisions which Supreme Court decisions like Guinn had lacked. This legislation reportedly led Lyndon Johnson to predict that the once Solid South would be lost to the Democratic Party. Indeed today, the New South is composed of “deep red” states. Ironically, it was the Civil Rights movement that made the prosperous New South possible – with segregation, companies (both domestic and international) wouldn’t relocate or expand operations there; with segregation, the impressive MARTA metro system of Atlanta could never have been possible; with segregation, a modern consumer economy cannot function; with segregation, Alabama wouldn’t be the reigning national football champion – and college football is a big, big business.
Predictably, there was a severe backlash against the new legislation, and already in 1964 two expedited challenges reached the Warren Court, Heart of Atlanta v. United States and Katzenbach v. McClung. Both rulings upheld the Civil Rights Act in 9-0 decisions. Interestingly, in both cases the Court invoked the Commerce Clause of the Constitution rather than the 13th and 14th amendments, basing the decisions on the authority of the federal government to regulate interstate commerce rather than on civil liberties; experts warn that this could make these decisions vulnerable in the future.
The period of slavery followed by the period of segregation and Jim Crow laws lasted 346 years, from 1619 to 1965. Until 1776, this repression was enforced by the English Crown and Parliament; then, until the Civil War, by the Articles of Confederation and the U.S. Constitution; and then, until 1965, by state governments and the Supreme Court. During this time, there was massive wealth accumulation by white America, drawn in no small measure from the profits of slave labor and later the Jim Crow economy. Great universities such as the University of Virginia, Duke and Clemson owe their existence to fortunes gained through this exploitation. Recently, it was revealed that Georgetown University profited from significant slave sales in Maryland to finance its operations. In the North too, the profits from selling factory products to the slave states, to say nothing of the slave trade itself, contributed to the endowments of the great universities of the northeast. Indeed, Columbia, Brown and Harvard have publicly recognized their ties to slavery and the slave trade. On the other hand, Europeans who arrived in the U.S. in the waves of immigration following the Civil War, and their descendants, were able in large numbers to accumulate capital and accede to home ownership and eventually to higher education. Black America was simply denied this opportunity for those 346 years, and today the level of black family wealth is still appallingly low – to cite a Washington Post article of Sept. 28, 2017: “The median net worth of whites remains nearly 10 times the size of blacks’. Nearly 1 in 5 black families have zero or negative net worth — twice the rate of white families.”
It is hard to imagine how this historical injustice can ever be righted. The Supreme Court has played a nefarious role in all this, from the Marshall Court’s assiduous defense of the property rights of slave owners (Scott v. London (1806), etc.) to Dred Scott to Plessy, weakening the 14th and 15th amendments en passant, enabling Jim Crow and creating the world of “Separate But Equal.” Earl Warren’s leadership was needed in the period following the Civil War, but alas that is not what happened.
In addition to these celebrated civil rights cases, the Warren Court also had to take on suits involving the separation of Church and State and the protection of the individual citizen from the awesome power of the State, the very thing that made the Bill of Rights necessary. More to come. Affaire à suivre.

Business and Baseball

The twentieth century began in 1901. Teddy Roosevelt became President after William McKinley’s assassination by an anarchist at the Pan American Exposition in Buffalo NY. This would prove a challenging time for the Supreme Court and judicial review. By the end of the century the power and influence of the Court over life in America would far exceed the limits stipulated by the Baron de Montesquieu in The Spirit of the Laws or those predicted by the analysis of Alexander Hamilton in Federalist 78.
Normally, the most visible of the justices on the Court is the Chief Justice but in the period from 1902 till 1932, the one most quotable was Associate Justice Oliver Wendell Holmes Jr. Holmes Sr. was the famous physician, writer and poet, author of Old Ironsides and other entries in the K-12 canon. For his part, Holmes Jr. wrote Supreme Court decisions and dissents that have become part of the lore of the Court.
In 1905, the 5-4 Court ruled against the state of New York in one of its more controversial decisions, Lochner v. New York. Appealing to laissez-faire economics, the majority ruled that the state did not have the authority to limit bakery workers’ hours to 10 hours a day, 60 hours a week, even if the goal was to protect the workers’ health and that of the public. The judges perverted the Due Process Clause of the 14th Amendment, which reads:
    [Nor] shall any State deprive any person of life, liberty, or property, without due process of law
They invoked this clause of a civil rights Amendment to rule that the New York law interfered with an individual baker’s right to enter into a private contract. In his dissent, Holmes attacked the decision for applying the social Darwinism of Herbert Spencer (coiner of the phrase “survival of the fittest”) to the Constitution; rather pointedly, Holmes wrote
    The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.
Over time, the anti-labor aspects of this decision were undone by legislation but its influence on the discussion of “due process” continues. It has given rise to the verb “lochnerize” which is defined thusly by Wiktionary:
    To read one’s policy preferences into the Constitution, as was (allegedly) done by the U.S. Supreme Court in the 1905 case Lochner v. New York.
The parenthetical term “allegedly” presumably refers to Holmes’ critique. Two other contributions of Lochner to the English language are the noun “Lochnerism” and the phrase “the Lochner Era.”
In 1917, Congress passed the Espionage Act, which penalized protests and actions that contested American participation in WWI. This law and the amendments added to it by the Sedition Act (1918) were powerful tools for suppressing dissent, something pursued quite vigorously by the Wilson administration. A challenge to the act followed quickly with Schenck v. United States (1919). The Court unanimously upheld the Espionage Act; Holmes wrote the opinion and created some oft-cited turns of phrase:
    The most stringent protection of free speech would not protect a man in falsely shouting fire in a theatre and causing a panic.
    The question … is whether the words used … create a clear and present danger that … will bring about the substantive evils that Congress has a right to prevent.
Holmes’ opinion notwithstanding, the constitutionality of the Espionage Act is still debated because of its infringement on free speech.
Schenck was then followed by another case involving the Espionage Act, Debs v. United States (1919). Eugene Debs was the union activist and socialist leader whom the Court had already ruled against in the Pullman case known as In re Debs (1895). Writing again for a unanimous court, Holmes invoked Schenck and ruled that the right of free speech did not protect protest against the military draft. Debs was sentenced to ten years in prison and disenfranchised; that did not prevent him from running for President in 1920 – he received over 900,000 votes, more than 3% of the total.
Debs was soon pardoned by President Warren G. Harding in 1921 and even invited to the White House! The passionate Harding apparently admired Debs and did not approve of the way he had been treated by Wilson, the Espionage Act and the Court; Harding famously held that “men in Congress say things worse than the utterances” for which Debs was convicted. In 1923, having just announced a campaign to eliminate the rampant corruption in Washington, Harding died most mysteriously in the Palace Hotel in San Francisco: vampire marks on his neck, no autopsy, hasty burial – suspects ranged from a Norwegian seaman to Al Capone hit men to Harding’s long suffering wife. Harding was succeeded by Calvin Coolidge who is best remembered for his insight into the soul of the nation: “After all, the chief business of the American people is business.”
Although the Sedition Act amendments were repealed in 1921, the Espionage Act itself lumbers on. It has been used in more recent times against Daniel Ellsberg and Edward Snowden.
Today, Holmes is also remembered for his opinion in an anti-trust suit pitting a “third major league” against the established “big leagues.” The National League had been a profitable enterprise since 1876, with franchises stretching from Boston to St. Louis. At the top of the century, in 1901, the rival American League was formed, but the two leagues joined together in time for the first World Series in 1903. The upstart Federal League managed to field eight teams for the 1914 and 1915 seasons, but interference from the other leagues forced it to end operations. A suit charging the National and American leagues with violating the Sherman Anti-Trust Act was filed in 1915, and it was heard before Judge Kenesaw Mountain Landis – who, interestingly, was to become the righteous Commissioner of Baseball following the Black Sox Scandal of 1919. In federal court, Landis dramatically slow-walked the Federal League’s case, and the result was that the various owners made deals, some buying into National or American League teams and/or folding their teams into established ones. The exception was the owner of the Baltimore Terrapins franchise. (The terrapin is a small turtle from Maryland, but the classic name for a Baltimore team is the Orioles; perhaps the name still belonged to the New York Yankees organization, since the old Baltimore Orioles, when dropped from the National League, joined the new American League in 1901 and then moved north in 1903 to become the New York Highlanders, that name being changed to New York Yankees in 1913.) Be that as it may, the league-less Terrapins continued to sue the major leagues for violating anti-trust law; this suit made its way to the Supreme Court as Federal Baseball Club v. National League.
In 1922, writing for a unanimous court in the Federal case, Holmes basically decreed that Major League Baseball was not a business enterprise engaged in interstate commerce; with Olympian authority, he wrote:
    The business [of the National League] is giving exhibitions of baseball. … the exhibition, although made for money, would not be called trade or commerce in the commonly accepted use of those words.
So, this opinion simply bypasses the Commerce Clause of the Constitution, which states that the Congress shall have power
    To regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes.
With this verdict, Major League Baseball, being a sport and not a business, was exempt from anti-trust regulation. This enabled “the lords of baseball” to continue to keep a player bound to the team that first signed him; the mechanism for this was yet another clause, the “reserve clause,” which was attached to all players’ contracts. The “reserve clause” also allowed a team (but not a player) to dissolve the player’s contract on 10 days’ notice. Obviously this had the effect of depressing player salaries. It also led to outrages such as Sal “The Barber” Maglie’s being blackballed for three seasons for having played in the Mexican League, and such as the Los Angeles Dodgers’ treatment of Brooklyn great Carl “The Reading Rifle” Furillo. Interestingly, although both were truly star players, neither is “enshrined” in the Baseball Hall of Fame; Furillo, though, is featured in Roger Kahn’s classic The Boys of Summer, and both Furillo and Maglie take the field in Doris Kearns Goodwin’s charming memoir Wait Till Next Year.
The Supreme Court judgment in Federal was mitigated by subsequent developments that were set in motion by All-Star outfielder Curt Flood’s courageous challenge to the “reserve clause” in the suit Flood v. Kuhn (1972). The case was decided against Flood by the Supreme Court in a 5-3 decision based on the precedent of the Federal ruling; Justice Lewis Powell recused himself because he owned stock in Anheuser-Busch, the company that owned the St. Louis franchise – an honorable thing to do, but one that exposes the potential class bias that might lie behind decisions favoring corporations and the powerful. Though Flood lost, there were vigorous dissents by Justices Marshall, Douglas and Brennan, and his case rattled the system; the players union was then able to negotiate for free agency in 1975. However, because of the anti-trust exemption, Major League Baseball still has much more control over its domain than do other major sports leagues, even though the NFL and the NCAA benefit from legislation exempting them too from some anti-trust regulations.
In 1932, when Franklin D. Roosevelt was elected President, the U.S. was nearly three years into the Great Depression. With the New Deal, the Congress moved quickly to enact legislation that would serve both to stimulate the economy and to improve working conditions. In the First Hundred Days of the Roosevelt presidency, the National Industrial Recovery Act (NIRA) and the Agricultural Adjustment Act (AAA) were passed. Both were then declared unconstitutional in whole or in part by the Court under Chief Justice Charles Evans Hughes: the case Schechter Poultry Corp. v. United States (1935) was brought by a kosher poultry business in Brooklyn NY (for one thing, the NIRA regulations interfered with its traditional slaughter practices); in United States v. Butler (January 6, 1936), the government filed a case against a cotton processor in Massachusetts who contested paying “processing and floor-stock taxes” to support subsidies for the planters of cotton. The first decision invoked the Commerce Clause of the Constitution; the second invoked the Taxing and Spending Clause, which empowers the Federal Government to impose taxes.
In reaction, Roosevelt and his congressional allies put together a plan in 1937 “to pack the Court” by adding six additional justices to its roster. The maneuver failed and was treated with opprobrium by many. However, with the appointment of new justices to replace retiring ones, Roosevelt soon had a Court more to his liking, and by then the New Deal people had learned from experience not to push programs too clumsy to pass legal muster.
The core programs launched by the AAA were continued thanks to subsequent legislation that was upheld in later cases before the Court. The pro-labor part of the NIRA was rescued by the National Labor Relations Act (aka the Wagner Act) of 1935. The act, which protected labor unions, was sponsored by Prussian-born Senator Robert F. Wagner Sr. His feckless son Robert Jr. was the Mayor of New York who enabled the city’s two National League baseball teams to depart for the West Coast in 1957 – two teams that between them had won the National League pennant every fall for the previous six seasons, two teams with stadium-filling heroes like Willie Mays and Sandy Koufax. Moreover, the Dodgers would never have been able to treat fan-favorite Carl Furillo so shabbily had the team still been in Brooklyn: he would not have been dropped mid-season, which made him ineligible for the pension of a 15-year veteran, would not have had to go to court to obtain money still due him and would not have been blackballed from organized baseball.

The Dred Scott Decision

Early in its history, the U.S. Supreme Court applied judicial review to acts of Congress. First there was Hylton v. United States (1796) and then Marbury v. Madison (1803); with these cases the Court’s power to decide the constitutionality of a law was established – constitutional in the first case, unconstitutional in the second. But it would take over 50 years for the Court again to declare a law passed by Congress and signed by the President to be unconstitutional. Moreover, that fateful decision would push the North and South apart to the point of no return. From the time of the Declaration of Independence, the leadership of the country had navigated carefully to maintain a union of free and slave states; steering this course was delicate and full of cynical calculations. How did these orchestrated compromises keep the peace between North and South? Mystère.

During the Constitutional Convention (1787), to deal with states’ rights and with “the peculiar institution” of chattel slavery, two key arrangements were worked out. The Connecticut Compromise favored small states by according them the same number of Senators as the larger states; the Three-Fifths Compromise included 3/5ths of enslaved African Americans in a state’s population count for determining representation in the House of Representatives and, thus, in the Electoral College as well. The Electoral College itself was a compromise between those who wanted direct election of the President and those, like Madison and Hamilton, who wanted a buffer between the office and the people – it has worked well in that 5 times the system has put someone in the office of President who did not win the popular vote.
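To make the three-fifths arithmetic concrete, here is a worked illustration (the population figures are invented for the example): a state with 400,000 free inhabitants and 100,000 enslaved inhabitants would be credited with

$$400{,}000 + \tfrac{3}{5} \times 100{,}000 = 460{,}000$$

inhabitants for the purpose of apportioning House seats, inflating such a state’s representation – and hence its Electoral College weight – relative to its voting population.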

The compromise juggernaut began anew in 1820 with an act of Congress known as the Missouri Compromise, which provided for Maine to enter the union as a free state and for Missouri to enter as a slave state. It also set the southern boundary of Missouri, 36° 30′, as the northern boundary for any further expansion of slavery; at this point in time, the only U.S. land west of the Mississippi River was the Louisiana Territory, and so the act designated only areas of today’s Arkansas and Oklahoma as potential slave states; click HERE. (This landscape would change dramatically with the annexation of Texas in 1845 and the Mexican War of 1848.)

Then there was the Compromise Tariff of 1833 that staved off a threat to the Union known as the Nullification Crisis, a drama staged by John C. Calhoun of South Carolina, Andrew Jackson’s Vice-President at the time. The South wanted lower tariffs on finished goods and the North wanted lower tariffs on raw materials. Calhoun was a formidable and radical political thinker and is known as the “Marx of the Master Class.” His considerable fortune went to his daughter and then to his son-in-law Thomas Green Clemson, who in turn in 1888 left most of this estate to found Clemson University, which makes one wonder why Clemson did not name the university for his illustrious father-in-law.

As a result of the Mexican War, in 1848, Alta California became a U.S. territory. The area was already well developed, with roads (e.g. El Camino Real), cities (e.g. El Pueblo de la Reina de Los Angeles), Jesuit prep schools (e.g. Santa Clara) and a long-pacified Native American population, herded together by the Spanish mission system. With the Gold Rush of 1849, the push for statehood became unstoppable. This led to the Compromise of 1850, which admitted California as a free state and which instituted a strict fugitive slave law designed to thwart Abolitionists and the Underground Railroad. Henry Clay of Kentucky was instrumental in all three of these nineteenth century compromises, which earned him the titles “the Great Compromiser” and “the Great Pacificator,” both of which school textbooks like to perpetuate. Clay, who ran for President three times, is also known for stating “I would rather be right than be President,” which sounds so quaint in today’s world where “truth is not truth” and where facts can yield to “alternative facts.”

In 1854, the Missouri Compromise was modified by the Kansas-Nebraska Act, championed by Stephen Douglas, later Lincoln’s opponent in the 1858 Senate race; this act applied “squatter sovereignty” to the territories of Kansas and Nebraska, which were north of 36° 30′ – meaning that the settlers themselves would decide whether to outlaw slavery or not. Violence soon broke out pitting free-staters (John Brown and his sons among them) against pro-slavery militias from neighboring Missouri, all of which led to the atrocities of “bleeding Kansas.”

But even the atrocities in Kansas did not overthrow the balance of power between North and South. So how did a Supreme Court decision in 1857 undo over 80 years of carefully orchestrated compromises between the North and South? Mystère.

In 1831, Dred Scott, a slave in Missouri, was sold to Dr. John Emerson, a surgeon in the U.S. Army. Emerson took Scott with him as he spent several years in the free state of Illinois and in the Wisconsin Territory, where slavery was outlawed by the Northwest Ordinance of 1787 and by the Missouri Compromise itself. While in the Wisconsin Territory, Scott married Harriet Robinson, also a slave; the ceremony was performed by a Justice of the Peace. Logically, this meant that they were no longer considered slaves: in the U.S. at that time, slaves were prohibited from marrying because they could not enter into a legal contract – and legal marriage has been the basis of the transmission of property and the accumulation of wealth and capital since ancient Rome.

Some years later, back in Missouri, with help from an abolitionist pastor and others, Scott sued for his freedom on the grounds that his stay in free territory was tantamount to manumission; this long process began in 1846. For an image of Dred Scott, the plaintiff and the individual, click HERE.

Previous cases of this kind had been decided in the petitioner’s favor; but, due to legal technicalities and such, this case reached the Missouri Supreme Court where the ruling went against Scott; from there it went to the U.S. Supreme Court.

John Marshall was the fourth Chief Justice and served in that capacity from 1801 to 1835. His successor, appointed by Andrew Jackson, was Roger Taney (pronounced “Tawny”) of Maryland. Taney was a Jackson loyalist and also a Roman Catholic, the first but far from the last Catholic to serve on the Court.

In 1857, the Court declared the Missouri Compromise to be flat-out unconstitutional in the most egregious ruling in its history, the Dred Scott Decision. This was the first time since Marbury that a federal law was declared unconstitutional: in the 7-2 decision penned by Taney himself, the Chief Justice asserted that the federal government had no authority to control slavery in territories acquired after the creation of the U.S. as a nation, meaning all the land west of the Mississippi. Though the question was not properly before the Court, Taney went on to rule that even free African Americans could not be U.S. citizens and drove his point home with painful racist rhetoric: former slaves and their descendants “had no rights which the white man was bound to respect.” The Dred Scott Decision drove the country straight towards civil war.

Scott himself soon gained his freedom thanks to a member of a family who had supported his case. But sadly he died from tuberculosis in 1858 in St. Louis. Scott and his wife Harriet have been honored with a plaque on the St. Louis Walk of Fame along with Charles Lindbergh, Chuck Berry and Stan Musial; in Jefferson City, there is a bronze bust of Scott in the Hall of Famous Missourians, along with Scott Joplin, Walt Disney, Walter Cronkite, and Rush Limbaugh making for some strange bedfellows.

President James Buchanan approved of the Taney decision, thinking it put the slavery question to rest. This is certainly part of the reason Buchanan used to be rated as the worst president in U.S. history. Historians also believe that Buchanan improperly consulted with Taney before the decision came down, perhaps securing Buchanan’s place in the rankings for the near future despite potential new competition in this arena.

The Dred Scott Decision wrecked the reputation of the Court for years – how blinded by legalisms must the justices have been not to realize what their ruling actually said! Charles Evans Hughes, Chief Justice from 1930 to 1941 and foe of FDR and his New Deal, lamented that the Dred Scott Decision was the worst example of the Court’s “self-inflicted wounds.” The Court did recover in time, however, to return to the practice of debatable, controversial decisions.

To start, in 1873, it gutted the 14th Amendment’s protection of civil rights with its 5-4 decision in the Slaughterhouse Cases, a combined case from New Orleans where the State Legislature had set up a monopoly over slaughterhouses. The decision seriously weakened the “privileges and immunities” clause of the Amendment:

    No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States.

There is a subtlety here: the U.S. Constitution’s Bill of Rights protects the citizen from abuse by the Federal Government; it is left to each state to have and to enforce its own protections of its citizens from abuse by the state itself. This clause was designed to protect the civil liberties of the nation’s new African-American citizens in the former slave states. The damage done by this decision would not be undone until the great civil rights cases of the next century.

In the Civil Rights Cases of 1883, the Supreme Court declared the Civil Rights Act of 1875 to be unconstitutional, thereby authorizing racial discrimination by businesses and setting the stage for Jim Crow legislation in the former Confederate states and in border states such as Maryland, Missouri and Kentucky – thus undoing the whole point of Reconstruction.

On the other hand, sandwiched around the Civil Rights Cases were some notable decisions that supported civil rights, such as Strauder v. West Virginia (1880), Boyd v. United States (1886) and Yick Wo v. Hopkins (1886). The first of these cases was brought by an African American and the third by a Chinese immigrant.

Somewhat later, during the “Gay Nineties,” the Supreme Court laid down some fresh controversial decisions to usher the U.S. into the new century. In 1895 there was In re Debs, where the Court allowed the government to obtain an injunction and use federal troops to end a strike against the Pullman Company. As a practical matter, this unanimous decision curbed the growing power of labor unions, and for the next forty years “big business” would use court injunctions to suppress strikes. The “Debs” in this case was Eugene Debs, then the head of the American Railway Union; later the Court would uphold the Espionage Act to rule against Debs in a case involving his speaking out against American entry into WWI.

It is interesting to note that this time, with In re Debs, the Court did not follow papal guidelines as it had with the Discovery Doctrine in Johnson v. McIntosh and other cases under John Marshall. This despite the fact that it had been handed an opportunity to do so. In his 1891 encyclical Rerum Novarum (“Of New Things”), Pope Leo XIII had come out in favor of labor unions; this was done at the urging of American cardinals and bishops who, at that time, were a progressive force in the Catholic Church. Without the urging of the U.S. hierarchy, the pope would likely have condemned unions as secret societies, lumped in with the Masons and Rosicrucians.

As the turn of the century approached, the Court’s ruling in Plessy v. Ferguson (1896) upheld racial segregation on railroads, in schools and in other public facilities with the tag line “separate but equal.” Considered one of the very worst of the Court’s decisions, it legalized racial segregation for another seventy years. In fact, it has never actually been overturned. The celebrated 1954 case Brown v. Board of Education only ruled against it in the case of schools and educational institutions – one of the clever legal arguments the NAACP made was that, for law schools and medical schools, “separate but equal” was impossible to implement. Subsequent decisions weakened Plessy further, but technically it is still “on the books.” The case itself dealt with “separate but equal” cars for railway passengers; one of the contributions of the Civil Rights movement to the economy of the New South is that it obviated the need for “separate but equal” subway cars and made modern transportation systems possible in Atlanta and other cities.

The number and range of landmark Supreme Court decisions expanded greatly in the twentieth century and that momentum continues to this day. We have drifted further and further from the view of Montesquieu and Hamilton that the Judiciary should be a junior partner next to the legislative and executive branches of government. The feared gouvernement des juges is upon us.

The Discovery Doctrine

John Marshall, the Federalist from Virginia and legendary fourth Chief Justice of the Supreme Court, is celebrated today for his impact on the U.S. form of government. To start, there is the decision Marbury v. Madison in 1803. In this ruling, the Court set a far-reaching precedent by declaring a law passed by Congress and signed by the President to be inconsistent with the Constitution – which at that point in time was a document six and a half pages long, amendments included. However, his Court never again ruled a federal law unconstitutional. So what sort of other stratagems did John Marshall resort to in order to leave his mark? Mystère.
One way the Marshall Court displayed its power was by means of three important cases involving the status and rights of Native Americans. The logic behind the first of these, Johnson v. McIntosh, is astonishing, and the case is used in law schools today as a classic example of a bad decision. The basis of this unanimous decision, written by Marshall himself, is a doctrine so medieval, so racist, so Euro-centric, so intolerant, so violent as to beggar belief. Yet it is so buried in the record that few are even remotely aware of it today. It is called the Doctrine of Christian Discovery or just the Discovery Doctrine.
Simply put, this doctrine states that a Christian nation has the right to take possession of any territory whose people are not Christians.
The term “Discovery” refers to the fact that the European voyages of discovery (initially out of Portugal and Spain) opened the coast of Africa and then the Americas to European takeovers.
All this marauding was justified (even ordered) by papal edicts written for Christian monarchs.
In his bull (the term for one of these edicts) entitled Romanus Pontifex (1452), Pope Nicholas V, in a burst of Crusader spirit, ordered the Portuguese King Alfonso V to “capture, vanquish, and subdue the Saracens, pagans, and other enemies of Christ,” to “put them into perpetual slavery,” and “to take all their possessions and property.” Columbus himself sailed with instructions to take possession of lands not ruled by Christian leaders. Alexander VI was the quintessential Renaissance pope, famous among other things for making nepotism something of a science – he was the father of Lucrezia Borgia (the passionate femme fatale of paintings, books and films, click HERE) and of Cesare Borgia (the model for Machiavelli’s prince, click HERE). In his Bulls of Donation of 1493, Alexander extended to Spain the right and duty to take sovereignty over all non-Christian territories “discovered” by its explorers and conquistadors; and then, on behalf of Spain and Portugal, with the Line of Demarcation, Alexander divided the globe into two zones, one for each to subjugate.
Not to be left behind, a century or so later, when England and Holland undertook their own voyages of discovery and colonization, they adopted the Discovery Doctrine for themselves despite the Protestant Reformation; France did as well. What is more, after Independence, the Americans “inherited” this privilege; indeed, in 1792, U.S. Secretary of State Thomas Jefferson declared that the Discovery Doctrine would pass from Europe to the newly created U.S. government – interesting that Jefferson, deist that he was, would resort to Christian privilege to further U.S. interests! In American hands, the Discovery Doctrine also gave rise to doctrines like Manifest Destiny and American Exceptionalism.
The emphasis on enslavement in Romanus Pontifex is dramatic. The bull was followed by Portuguese incursion into Africa and Portuguese involvement in the African slave trade, till then a Muslim monopoly. In the 1500s, African slavery became the norm in New Spain and New Portugal. In August 1619, when the Jamestown colony was only 12 years old, a ship that the Dutch had captured from Portuguese slavers reached the English settlement and Africans were traded for provisions – one simple application of the Discovery Doctrine, one fateful day for the U.S.
Papal exhortations to war were not new in 1452. A bull of Innocent III in 1208 instigated a civil war in France, the horrific Albigensian Crusade. Earlier, in 1155 the English conquest of Ireland was launched by a bull of Pope Adrian IV (the only English pope no less); this conquest has proved long and bloody and has created issues still unresolved today. And even earlier there was the cry “God Wills It” (“Deus Vult”) of Pope Urban II and the First Crusade.
Hopping forward to the U.S. of 1823, in Johnson v. McIntosh, the plaintiff group, referred to as “Johnson,” claimed that a purchase of land from Native Americans in Indiana was valid although the defendant McIntosh for his part had a claim to overlapping land from a federal land grant (federal would prove key). An earlier lower court had dismissed the Johnson claim. Now (switching to the historical present) John Marshall, writing for a unanimous court, reaffirms the lower court’s dismissal of the Johnson suit. But that isn’t enough. After a lengthy discussion of the history of the European voyages of discovery in the Americas, Marshall focuses on the manner in which each European power acquired land from the indigenous occupants. He outlines the Discovery Doctrine and how a European power gains sovereignty over land its explorers “discover”; he adds that the U.S. inherited this power from Great Britain and reaches the conclusion that only the Federal Government can obtain title to Native American land. Furthermore, he concludes that indigenous populations only retain the “right of occupancy” in their lands and that this right can still be dissolved by the Federal Government.
One of the immediate upshots of this decision was that only the Federal Government could purchase land from Native Americans. Going forward, this created a market with only one buyer. Just as a monopoly is a market with only one seller, a market with only one buyer is called a monopsony, a situation which could work against Native American interests – for the pronunciation of monopsony, click HERE. To counter the efforts of the Apple Computer company to muddy the waters, there’s just “one more thing”: the national apple of Canada, the name of the defendant in this case and the name of the inventor of the stylish raincoat are all written “McIntosh” and not “Macintosh.” (“Mc” is the medieval scribes’ abbreviation of “Mac,” the Gaelic patronymic of Ireland and Scotland; other variants include “M’c”, “M’” and “Mc” with the “c” raised with two dots or a line underneath it.)
The decision in Johnson formalized the argument made by Jefferson that the Discovery Doctrine applied to relations between the U.S. government and Native Americans. This doctrine is still regularly cited in federal cases and only recently the Discovery Doctrine was invoked by none other than Justice Ruth Bader Ginsburg writing for the majority in City of Sherrill v. Oneida Indian Nation of New York (2005), a decision which ruled against Oneida claims to sovereignty over once tribal lands that the Oneida had managed to re-acquire!
What has happened here with Johnson is that John Marshall made the Discovery Doctrine part of the law of the land thanks to the common law reliance on precedent. A similar thing happens when a ruling draws on the natural law of Christian theology, a practice known as “natural law jurisprudence.” In effect, in both scenarios, the Court is making law in the sense of legislation as well as in the sense of a judicial ruling.
A few years after Johnson, in response to the state of Georgia’s efforts to badger the Cherokee Nation off their lands, the Cherokee asked the Supreme Court for an injunction to put a stop to the state’s practices. The case Cherokee Nation v. Georgia (1831) was dismissed by the Court on a technicality drawn from its previous decision: the Cherokee, not being a foreign nation but rather a “ward to its guardian” the Federal Government, did not have standing to sue before the Court – thereby adding injury to the insult that was Johnson.
The next year Marshall actually made a ruling in favor of the Cherokee nation in Worcester v. Georgia (1832) which laid the foundation for tribal sovereignty over their lands. However, this was not enough to stop Andrew Jackson from carrying out the removal of the Cherokee from Georgia in the infamous Trail of Tears. In fact, confronted with Marshall’s decision, Jackson is reported to have said “Let him enforce it.”
The U.S. is not the only country to use the Discovery Doctrine. In the English-speaking world, it has been employed in Australia, New Zealand and elsewhere. In the Dutch-speaking world, it was used as recently as 1975 with the accession of Suriname to independence, where it is the basis for the rights (or lack of same) of indigenous peoples. Even more recently, in 2007, the Russian Federation invoked it when placing its flag on the floor of the Arctic Ocean to claim oil and gas reserves there. Interesting that Orthodox Christians would honor papal directives once it was in their economic interest – reminiscent of Jefferson.
In addition to Marbury and the cases dealing with Native Americans, there are several other Marshall Court decisions that are accorded “landmark” status today, such as McCulloch v. Maryland (1819), Cohens v. Virginia (1821) and Gibbons v. Ogden (1824) – all of which established the primacy of federal law and authority over the states. This consistent assertion of federal authority is the signature achievement of John Marshall.
Marshall’s term of 34 years is the longest for a Chief Justice. While his Court did declare state laws unconstitutional, for the Supreme Court to declare another federal law unconstitutional would take over half a century after Marbury. This would be the case that plunged the country into civil war. Affaire à suivre. More to come.

Marbury v. Madison

The Baron de Montesquieu and James Madison believed in the importance of the separation of powers among the executive, legislative and judicial branches of government. However, their view was that the third branch would not have power equal to that of either of the first two but enough so that no one branch would overpower the other two. In America today, things have shifted since 1789 when the Constitution became the law of the land: the legislative branch stands humbled by the reach of executive power and thwarted by endless interference on the part of the judiciary.
The dramatically expanded role of the executive can be traced to the changes the country has gone through since 1789 and the quasi-imperial military and economic role it plays in the world today.
The dramatically increased power of the judiciary is largely due to judicial review:
(a) the practice whereby a court can interpret the text of the Constitution itself or of a law passed by the Congress and signed by the President and tell us what the law “really” means, and
(b) the practice whereby a court can declare a law voted on by the Congress and signed by the President to be unconstitutional.
In fact, the term “unconstitutional” now so alarms the soul that it is even the title of Colin Quinn’s latest one-man show.
Things are very different in other countries. In the U.K., the Parliament is sovereign and its laws mean what Parliament says they mean. In France, under the Constitution of the Fifth Republic (1958), the reach of the Conseil Constitutionnel is very limited: Charles de Gaulle in particular was wary of the country’s falling into a gouvernement des juges – this last expression being a pejorative term for a situation like that in the U.S. today, where judges have power not seen since the time of Gideon and Samuel of the Hebrew Bible.
The U.S. legal system is based on the Norman French system (hence trial by a jury of one’s peers, “voir dire” and “oyez, oyez”) and its evolution into the British system of common law (“stare decisis” and the doctrine of precedent). So why is the U.S. so different from countries with whom it has so much in common in terms of legal culture? How did this come about? In particular, where does the power to declare laws unconstitutional come from? Mystère.
A famous early example of judicial review occurred in Jacobean England, about the time of the Jamestown settlement and of the completion of the King James Bible. In 1610, in a contorted dispute known as Dr. Bonham’s Case over the right to practice medicine, Justice Edward Coke opined in his decision that “in many cases, the common law will control Acts of Parliament.” This was not well received, Coke lost his job and Parliamentary Sovereignty became established in England. Picking himself up, Coke went on to write his Institutes of the Lawes of England, which became a foundational text for the American legal system and which is often cited in Supreme Court decisions, an example being no less a case than Roe v. Wade.
Another English jurist who had a great influence on the American colonists in the 18th century was Sir William Blackstone. His authoritative Commentaries on the Laws of England of 1765 became the standard reference on the Common Law, and in this opus, parliamentary sovereignty is unquestioned. The list of subscribers to the first edition of the Commentaries included future Chief Justices John Jay and John Marshall and even today the Commentaries are cited in Supreme Court decisions between 10 and 12 times a year. Blackstone had his detractors, however: Alexis de Tocqueville described him as “an inferior writer, without liberality of mind or depth of judgment.”
Blackstone notwithstanding, judicial review naturally appealed to the colonists: they were the target of laws enacted by a parliament where they had no representation; turning to the courts was the only recourse they had. Indeed, a famous and stirring call for the courts to overturn an act of the British Parliament was made by James Otis of Massachusetts in 1761. The Parliament had just renewed the hated writs of assistance and Otis argued (brilliantly it is said) that the writs violated the colonists’ natural rights and that any act of Parliament that took away those rights was invalid. Still, the court decided in favor of Parliament. Otis’ appeal to natural rights harkens back to Coke and Blackstone and to the natural law concept that was developed in the late Middle Ages by Thomas Aquinas and other scholastic philosophers. Appeal to natural law is “natural” when working in common law systems where there is no written text to fall back on; it is dangerous, however, in that it tugs at judges’ religious and emotional sensibilities.
Judicial review more generally emerged within the U.S. in the period under the Articles of Confederation where each state had its own constitution and legal system. By 1787, state courts in 7 of the 13 states had declared laws enacted by the state legislatures to be invalid.
A famous example of this took place in Massachusetts where slavery was still legal when the state constitution went into effect. Subsequently, in a series of cases known collectively as the Quock Walker Case, the state supreme court applied judicial review to overturn state law as unconstitutional and to abolish slavery in Massachusetts in 1783.
As another example at the state level before the Constitution, in New York the state constitution provided for a Council of Revision which applied judicial review to all bills before they could become law; however, a negative decision by the Council could be overturned by a 2/3 majority vote in both houses of the state legislature.
In 1784 in New York, in the Rutgers v. Waddington case, Alexander Hamilton, taking a star turn, argued that a New York State law known as the Trespass Act, which was aimed at punishing Tories who had stayed loyal to the Crown during the Revolutionary War, was invalid. Hamilton’s argument was that the act violated terms of the Treaty of Paris of 1783; this treaty put an end to the Revolutionary War and, in its Articles VI and VII, addressed the Tories’ right to their property. Clearly Hamilton wanted to establish that federal treaties overruled state law, but he may well also have wanted to keep Tories and their money in New York. Indeed, the British were setting up the English-speaking Ontario Province in Canada to receive such émigrés, including a settlement on Lake Ontario alluringly named York – which later took back its original Native Canadian name, Toronto. For a picture of life in Toronto in the old days, click HERE.
The role of judicial review came up in various ways at the Constitutional Convention of 1787. For example, with the Virginia Plan, Madison wanted there to be a group of judges to assist the president in deciding whether or not to veto a bill, much like the New York State Council of Revision – and here too the decision could be overturned by a supermajority vote in Congress. The Virginia Plan was not adopted; many at the Convention saw no need for an explicit inclusion of judicial review in the final text, but they did expect the courts to be able to exercise constitutional review. For example, Elbridge Gerry of Massachusetts (later of gerrymander fame) said federal judges “would have a sufficient check against encroachments on their own department by their exposition of the laws, which involved a power of deciding on their constitutionality.” Luther Martin of Maryland (though born in New Jersey) added that as “to the constitutionality of laws, that point will come before the judges in their official character….” For his part, Martin found that the Constitution as drawn up made for too strong a central government and opposed its ratification.
The Federalist Papers were newspaper articles and essays written by the founding fathers John Jay, James Madison and Alexander Hamilton, all using the pseudonym “Publius,” a tip of the hat to Publius Valerius Publicola, a founding hero of the Roman Republic (for the enduring relevance of Roman history, try Tacitus by historian Ronald Mellor). A group formed around Hamilton and Jay, giving birth to the Federalist Party, the first national political party – it stood for a strong central government run by an economic elite; this quickly gave rise to an opposition group, the Democratic Republicans (Jefferson, Burr, …), and the party system was born, somewhat to the surprise of those who had written the Constitution. Though in the end judicial review was left out of the Constitution, the need for it was brought up again right after the Convention in the Federalist Papers: in June 1788 Hamilton, already a star, published Federalist 78, in which he argued for the need for judicial review of the constitutionality of legislation as a check on abuse of power by the Congress. In this piece, he also invokes Montesquieu on the relatively smaller role the judiciary should have in government compared to the other two branches.
Fast forward two centuries: the Federalist Society is a political gate-keeper which was founded in 1982 to increase the number of right-leaning judges on the federal courts. Its founders included such high-profile legal thinkers as Robert Bork (whose own nomination to the Supreme Court was so dramatically scuttled by fierce opposition to him that it led to a coinage, the verb “to bork”). The Society regrouped and since then members Antonin Scalia, John G. Roberts, Clarence Thomas, Samuel Alito and Neil Gorsuch have acceded to the Supreme Court itself. (By the way, Federalist 78 is one of their guiding documents.)
Back to 1788: here is what Section 1 of Article III of the Constitution states:
    The judicial Power of the United States shall be vested in one Supreme Court, and in such inferior Courts as the Congress may from time to time ordain and establish.
Section 2 of Article III spells out the courts’ purview:
    The judicial Power shall extend to all Cases, in Law and Equity, arising under this Constitution, the Laws of the United States, and Treaties made, or which shall be made, under their Authority;—to all Cases affecting Ambassadors, other public Ministers and Consuls;—to all Cases of admiralty and maritime Jurisdiction;—to Controversies to which the United States shall be a Party;—to Controversies between two or more States;—between a State and Citizens of another State;—between Citizens of different States;—between Citizens of the same State claiming Lands under Grants of different States, and between a State, or the Citizens thereof, and foreign States, Citizens or Subjects.
So while Article III lays out responsibilities for the court system, it does not say that the courts have the power to review the work of the other two branches of government, much less to declare any of it unconstitutional.
Clause 2 of Article VI of the Constitution, known as the Supremacy Clause, states that federal law overrides state law. In particular, this implies that a federal court can nullify a law passed by a state. But, again, it says nothing about the courts reviewing federal law.
So there is no authorization of judicial review in the U.S. Constitution. However, given the precedents from the state courts and the positions of Madison, Gerry, Martin, Hamilton et al., it is as though lines from Federalist 78 such as these were slipped into the Constitution while everyone was looking:
  The interpretation of the laws is the proper and peculiar province of the courts. A constitution is, in fact, and must be regarded by the judges as, a fundamental law. It therefore belongs to them to ascertain its meaning, as well as the meaning of any particular act proceeding from the legislative body. … If there should happen to be an irreconcilable variance between the [Constitution and an act of the legislature], the Constitution ought to be preferred to the statute.
The Supreme Court of the United States (SCOTUS) and the federal court system were set up straightaway by the Judiciary Act of 1789, passed by Congress and signed by President George Washington.
The first “big” case adjudicated by the Supreme Court was Chisholm v. Georgia (1793). Here the Court ruled in favor of the plaintiff Alexander Chisholm and against the State of Georgia, implicitly holding that nothing in the Constitution prevented Chisholm from suing the state in federal court. This immediately led to an outcry among the states and to the 11th Amendment, which precludes a state’s being sued in federal court without that state’s consent. So here the Constitution was itself amended to trump a Court decision.
The precedent for explicit judicial review was set in 1796, seven years after the Court’s creation, in the case Hylton v. United States: this was the first time the Court ruled on the constitutionality of a law passed by Congress and signed by the President. At issue was the Carriage Act of 1794, which placed a yearly tax of $16 on horse-drawn carriages owned by individuals or businesses. Hylton asserted that this was a direct tax and so violated the rules for federal taxation laid out in the Constitution, while Alexander Hamilton, back in the spotlight, pled the government’s case that the tax was consistent with the Constitution. The Court decided in favor of the government, thus affirming the constitutionality of a federal law for the first time; by making this ruling, the Court claimed for itself the authority to determine the constitutionality of a law, a power not provided for in the Constitution but one assumed to come with the territory. This verdict held sway for ninety-nine years; it was overturned in 1895 (Pollock v. Farmers’ Loan and Trust), a decision in turn largely undone by the 16th Amendment (1913), which authorized unapportioned federal taxes on income.
Section 13 of the Judiciary Act of 1789 empowered SCOTUS to order the government to do something specific for a plaintiff if the government is obliged by law to do it but has failed to act. In technical terms, the Court would issue a writ of mandamus ordering the government to act – mandamus, meaning “we command,” is a form of the Latin verb mandare.
John Marshall, a Federalist from Virginia, was President John Adams’s Secretary of State. When Jefferson, and not Adams, won the election of 1800, Adams hurried to make federal appointments ahead of Jefferson’s inauguration the following March; these were the notorious “midnight appointments.” Among them was the appointment of John Marshall himself to the post of Chief Justice of the Supreme Court. Another was the appointment of William Marbury to a judgeship in the District of Columbia. It was Marshall’s job, while he was still Secretary of State, to prepare and deliver the paperwork and official certifications for these appointments. He failed to accomplish this in time for Marbury and some others; when Jefferson took office, he instructed his Secretary of State, James Madison, not to complete the unfinished certifications.
In the “landmark” case Marbury v. Madison (1803), William Marbury petitioned the Court under Section 13 of the Judiciary Act to order the Secretary of State, James Madison, to issue Marbury’s commission as Justice of the Peace in the District of Columbia, the certification being still unfinished thanks to John Marshall, now the Chief Justice. In a legalistic tour de force, the Court affirmed that Marbury was in the right and that his commission should be issued – but ruled against him. John Marshall and his judges declared Section 13 of the Judiciary Act unconstitutional because (according to the Court) it would enlarge the Court’s original jurisdiction beyond that permitted by the Constitution.
Let’s try to analyze the logic of this decision. Put paradoxically, the Court exercised a power not given to it in the Constitution to rule that it could not exercise a power not given to it in the Constitution. Put ironically, it ascribed to itself the power to be powerless. Put dramatically, Marbury, himself not a lawyer, might well have cheered on Dick the Butcher, who has the line “let’s kill all the lawyers” in Henry VI, Part 2 – but all this business is less like Shakespeare and more like Aristophanes.
Declaring federal laws unconstitutional did not turn into a habit in the 19th century. The Marshall court itself never declared another federal law unconstitutional, but it did strike down state laws. For example, Luther Martin was on the losing side in McCulloch v. Maryland (1819), when the Court declared unconstitutional a Maryland state law levying a tax on a federally chartered national bank.
The story doesn’t end there.
Although the Supreme Court would not rule another federal law unconstitutional until 1857 (the infamous Dred Scott decision, which struck down the Missouri Compromise), the two-plus centuries since Marbury have seen a dramatic surge in judicial review and outbreaks of judicial activism on the part of courts both left-wing and right-wing. There is a worrisome tilt toward increasing judicial power: “worrisome” because things might not stop there; the 2nd Book of Samuel (Samuel being the last Judge of the Israelites) is followed by the 1st Book of Kings, something to do with the need to improve national defense. Affaire à suivre. More to come.