Business and Baseball

The twentieth century began in 1901. Teddy Roosevelt became President after William McKinley’s assassination by an anarchist at the Pan American Exposition in Buffalo NY. This would prove a challenging time for the Supreme Court and judicial review. By the end of the century the power and influence of the Court over life in America would far exceed the limits stipulated by the Baron de Montesquieu in The Spirit of the Laws or those predicted by the analysis of Alexander Hamilton in Federalist 78.
Normally, the most visible of the justices on the Court is the Chief Justice, but in the period from 1902 to 1932 the most quotable was Associate Justice Oliver Wendell Holmes Jr. His father, Holmes Sr., was the famous physician, writer and poet, author of Old Ironsides and other entries in the K-12 canon. For his part, Holmes Jr. wrote Supreme Court decisions and dissents that have become part of the lore of the Court.
In 1905, the Court ruled 5-4 against the state of New York in one of its more controversial decisions, Lochner v. New York. Appealing to laissez-faire economics, the majority ruled that the state did not have the authority to limit bakery workers’ hours to 10 hours a day, 60 hours a week, even if the goal was to protect the workers’ health and that of the public. The judges perverted the Due Process Clause of the 14th Amendment, which reads:
    [Nor] shall any State deprive any person of life, liberty, or property, without due process of law.
They invoked this clause of a civil rights Amendment to rule that the New York law interfered with an individual baker’s right to enter into a private contract. In his dissent, Holmes attacked the decision for applying the social Darwinism of Herbert Spencer (coiner of the phrase “survival of the fittest”) to the Constitution; rather pointedly, Holmes wrote
    The Fourteenth Amendment does not enact Mr. Herbert Spencer’s Social Statics.
Over time, the anti-labor aspects of this decision were undone by legislation but its influence on the discussion of “due process” continues. It has given rise to the verb “lochnerize” which is defined thusly by Wiktionary:
    To read one’s policy preferences into the Constitution, as was (allegedly) done by the U.S. Supreme Court in the 1905 case Lochner v. New York.
The parenthetical term “allegedly” presumably refers to Holmes’ critique. Two other contributions of Lochner to the English language are the noun “Lochnerism” and the phrase “The Lochner Era.”
In 1917, Congress passed the Espionage Act, which penalized protests and actions that contested American participation in WWI. This law and its added amendments in the Sedition Act (1918) were powerful tools for suppressing dissent, something pursued quite vigorously by the Wilson administration. A challenge to the act followed quickly with Schenck v. United States (1919). The Court unanimously upheld the Espionage Act; Holmes wrote the opinion and created some oft-cited turns of phrase:
    The most stringent protection of free speech would not protect a man in falsely shouting fire in a theatre and causing a panic.
    The question … is whether the words used … create a clear and present danger that … will bring about the substantive evils that Congress has a right to prevent.
Holmes’ opinion notwithstanding, the constitutionality of the Espionage Act is still debated because of its infringement on free speech.
Schenck was then followed by another case involving the Espionage Act, Debs v. United States (1919). Eugene Debs was the union activist and socialist leader whom the Court had already ruled against in the Pullman case known as In re Debs (1895). Writing again for a unanimous court, Holmes invoked Schenck and ruled that the right of free speech did not protect protest against the military draft. Debs was sentenced to ten years in prison and disenfranchised; that did not prevent him from running for President in 1920 – he received over 900,000 votes, more than 3% of the total.
Debs’s sentence was soon commuted by President Warren G. Harding in 1921, and he was even invited to the White House! The passionate Harding apparently admired Debs and did not approve of the way he had been treated by Wilson, the Espionage Act and the Court; Harding famously held that “men in Congress say things worse than the utterances” for which Debs was convicted. In 1923, having just announced a campaign to eliminate the rampant corruption in Washington, Harding died most mysteriously in the Palace Hotel in San Francisco: vampire marks on his neck, no autopsy, hasty burial – suspects ranged from a Norwegian seaman to Al Capone hit men to Harding’s long-suffering wife. Harding was succeeded by Calvin Coolidge, who is best remembered for his insight into the soul of the nation: “After all, the chief business of the American people is business.”
Although the Sedition Act amendments were repealed in 1921, the Espionage Act itself lumbers on. It has been used in more recent times against Daniel Ellsberg and Edward Snowden.
Today, Holmes is also remembered for his opinion in an anti-trust suit pitting a “third major league” against the established “big leagues.” The National League had been a profitable enterprise since 1876, with franchises stretching from Boston to St. Louis. At the top of the century, in 1901, the then-rival American League was formed, but the two leagues joined together in time for the first World Series in 1903. The upstart Federal League managed to field eight teams for the 1914 and 1915 seasons, but interference from the other leagues forced it to end operations. A suit charging the National and American Leagues with violating the Sherman Anti-Trust Act was filed in 1915 and heard before Judge Kenesaw Mountain Landis – who, interestingly, was to become the righteous Commissioner of Baseball following the Black Sox Scandal of 1919. In Federal Court, Landis dramatically slow-walked the Federal League’s case, and the result was that the various owners made deals, some buying into National or American League teams and/or folding their teams into established ones. The exception was the owner of the Baltimore Terrapins franchise. (The terrapin is a small turtle from Maryland, but the classic name for a Baltimore team is the Orioles; perhaps that name still belonged to the New York Yankees organization, since the old Baltimore Orioles, when dropped from the National League, joined the new American League in 1901 and then moved north to become the New York Highlanders in 1903, a name changed to New York Yankees in 1913.) Be that as it may, the league-less Terrapins continued to sue the major leagues for violating anti-trust law; this suit made its way to the Supreme Court as Federal Baseball Club v. National League.
In 1922, writing for a unanimous court in the Federal case, Holmes basically decreed that Major League Baseball was not a business enterprise engaged in interstate commerce; with Olympian authority, he wrote:
    The business [of the National League] is giving exhibitions of baseball. … the exhibition, although made for money, would not be called trade or commerce in the commonly accepted use of those words.
So, this opinion simply bypasses the Commerce Clause of the Constitution, which states that the Congress shall have power
    To regulate Commerce with foreign Nations, and among the several States, and with the Indian Tribes.
With this verdict, Major League Baseball, being a sport and not a business, was exempt from anti-trust regulation. This enabled “the lords of baseball” to continue to keep a player bound to the team that first signed that player; the mechanism for this was yet another clause, the “reserve clause,” which was attached to all the players’ contracts. The “reserve clause” also allowed a team (but not a player) to dissolve the player’s contract on 10 days’ notice. Obviously this had the effect of depressing player salaries. It also led to outrages such as Sal “The Barber” Maglie’s being blackballed for three seasons for having played in the Mexican League, and the Los Angeles Dodgers’ treatment of Brooklyn great Carl “The Reading Rifle” Furillo. Interestingly, although both were truly star players, neither is “enshrined” in the Baseball Hall of Fame; Furillo, though, is featured in Roger Kahn’s classic The Boys of Summer, and both Furillo and Maglie take the field in Doris Kearns Goodwin’s charming memoir Wait Till Next Year.
The Supreme Court judgment in Federal was mitigated by subsequent developments that were set in motion by All-Star outfielder Curt Flood’s courageous challenge to the “reserve clause” in the suit Flood v. Kuhn (1972). The Supreme Court decided this case against Flood 5-3, relying on the precedent of the Federal ruling; Justice Lewis Powell recused himself because he owned stock in Anheuser-Busch, the company that owned the St. Louis franchise – an honorable thing to do, but something which exposes the potential class bias of the Court that might lie behind decisions favoring corporations and the powerful. Though Flood lost, there were vigorous dissents by Justices Marshall, Douglas and Brennan, and his case rattled the system; the players’ union was then able to negotiate for free agency in 1975. However, because of the anti-trust exemption, Major League Baseball still has much more control over its domain than do other major sports leagues, even though the NFL and the NCAA benefit from legislation exempting them too from some anti-trust regulations.
When Franklin D. Roosevelt became President in 1933, the U.S. was more than three years into the Great Depression. With the New Deal, Congress moved quickly to enact legislation that would serve both to stimulate the economy and to improve working conditions. In the First Hundred Days of the Roosevelt presidency, the National Industrial Recovery Act (NIRA) and the Agricultural Adjustment Act (AAA) were passed. Both were then declared unconstitutional in whole or in part by the Court under Chief Justice Charles Evans Hughes: the case Schechter Poultry Corp. v. United States (1935) was brought by a kosher poultry business in Brooklyn, NY (for one thing, the NIRA regulations interfered with its traditional slaughter practices); with United States v. Butler (January 6, 1936), the government filed a case against a processor of cotton in Massachusetts who contested paying “processing and floor-stock taxes” to support subsidies for the planters of cotton. The first decision invoked the Commerce Clause of the Constitution; the second invoked the Taxing and Spending Clause, which empowers the Federal Government to impose taxes.
In reaction, Roosevelt and his congressional allies put together a plan in 1937 “to pack the court” by adding six additional justices to its roster. The maneuver failed, treated with opprobrium by many. However, with the appointment of new justices to replace retiring justices, Roosevelt soon had a Court more to his liking and also by then the New Deal people had learned from experience not to push programs that were too clumsy to pass legal muster.
The core programs launched by the AAA were continued thanks to subsequent legislation that was upheld in later cases before the Court. The pro-labor part of the NIRA was rescued by the National Labor Relations Act (aka the Wagner Act) of 1935. The act, which protected labor unions, was sponsored by Prussian-born Senator Robert F. Wagner Sr.; his feckless son Robert Jr. was the Mayor of New York who enabled its two National League baseball teams to depart for the West Coast in 1957 – two teams that between them had won the National League pennant every fall for the previous six seasons, two teams with stadium-filling heroes like Willie Mays and Sandy Koufax. Moreover, the Dodgers would never have been able to treat fan-favorite Carl Furillo so shabbily had the team still been in Brooklyn: he would not have been dropped mid-season, which left him just short of qualifying for the pension of a 15-year veteran, would not have had to go to court to obtain money still due him, and would not have been blackballed from organized baseball.

The Dred Scott Decision

Early in its history, the U.S. Supreme Court applied judicial review to acts of Congress. First there was Hylton v. United States (1796) and then Marbury v. Madison (1803); with these cases the Court’s power to decide the constitutionality of a law was established – constitutional in the first case, unconstitutional in the second. But it would take over 50 years for the Court again to declare a law passed by Congress and signed by the President to be unconstitutional. Moreover, this fateful decision would push the North and South apart to the point of no return. From the time of the Declaration of Independence, the leadership of the country had navigated carefully to maintain a union of free and slave states; steering this course was delicate and full of cynical calculations. How did these orchestrated compromises keep the peace between North and South? Mystère.

During the Constitutional Convention (1787), to deal with states’ rights and with “the peculiar institution” of chattel slavery, two key arrangements were worked out. The Connecticut Compromise favored small states by according them the same number of Senators as the larger states; the Three-Fifths Compromise included 3/5ths of enslaved African Americans in a state’s population count for determining representation in the House of Representatives and, thus, in the Electoral College as well. The Electoral College itself was a compromise between those who wanted direct election of the President and those, like Madison and Hamilton, who wanted a buffer between the office and the people – it has worked well in that 5 times the system has put someone in the office of President who did not win the popular vote.

The compromise juggernaut began anew in 1820 with an act of Congress known as the Missouri Compromise which provided for Maine to enter the union as a free state and for Missouri to enter as a slave state. It also set the southern boundary of Missouri, 36° 30′, as the northern boundary for any further expansion of slavery; at this point in time, the only U.S. land west of the Mississippi River was the Louisiana Territory and so the act designated only areas of today’s Arkansas and Oklahoma as potential slave states; click HERE . (This landscape would change dramatically with the annexation of Texas in 1845 and the Mexican War of 1848.)

Then there was the Compromise Tariff of 1833 that staved off a threat to the Union known as the Nullification Crisis, a drama staged by John C. Calhoun of South Carolina, Andrew Jackson’s Vice-President at the time. The South wanted lower tariffs on finished goods and the North wanted lower tariffs on raw materials. Calhoun was a formidable and radical political thinker and is known as the “Marx of the Master Class.” His considerable fortune went to his daughter and then to his son-in-law Thomas Green Clemson, who in turn in 1888 left most of this estate to found Clemson University, which makes one wonder why Clemson did not name the university for his illustrious father-in-law.

As a result of the Mexican War, in 1848, Alta California became a U.S. territory. The area was already well developed with roads (e.g. El Camino Real), with cities (e.g. El Pueblo de la Reina de Los Angeles), with Jesuit prep schools (e.g. Santa Clara) and with a long-pacified Native American population, herded together by the Spanish mission system. With the Gold Rush of 1849, the push for statehood became unstoppable. This led to the Compromise of 1850, which admitted California as a free state and which instituted a strict fugitive slave law designed to thwart Abolitionists and the Underground Railroad. Henry Clay of Kentucky was instrumental in all three of these nineteenth century compromises which earned him the titles “the Great Compromiser” and “the Great Pacificator,” both of which school textbooks like to perpetuate. Clay, who ran for President three times, is also known for stating “I would rather be right than be President” which sounds so quaint in today’s world where “truth is not truth” and where facts can yield to “alternative facts.”

In 1854, the Missouri Compromise was modified by the Kansas-Nebraska Act, championed by Stephen Douglas, later Lincoln’s opponent for Senator; this act applied “squatter sovereignty” to the territories of Kansas and Nebraska which were north of 36° 30′ – this meant that the settlers there themselves would decide whether to outlaw slavery or not. Violence soon broke out pitting free-staters (John Brown and his sons among them) against pro-slavery militias from neighboring Missouri, all of which led to the atrocities of “bleeding Kansas.”

But even the atrocities in Kansas did not overthrow the balance of power between North and South. So how did a Supreme Court decision in 1857 undo over 80 years of carefully orchestrated compromises between the North and South? Mystère.

In 1831, Dred Scott, a slave in Missouri, was sold to Dr. John Emerson, a surgeon in the U.S. army. Emerson took Scott with him as he spent several years in the free state of Illinois and in the Wisconsin Territory, where slavery was outlawed by the Northwest Ordinance of 1787 and by the Missouri Compromise itself. While in the Wisconsin Territory, Scott married Harriet Robinson, also a slave; the ceremony was performed by a Justice of the Peace. Logically, this meant that they were no longer considered slaves: in the U.S. at that time, slaves were prohibited from marrying since they could not enter into a legal contract; legal marriage, on the other hand, has been the basis of transmission of property and accumulation of wealth and capital since ancient Rome.

Some years later, back in Missouri, with help from an abolitionist pastor and others, Scott sued for his freedom on the grounds that his stay in free territory was tantamount to manumission; this long process began in 1846. For an image of Dred Scott, the plaintiff and the individual, click HERE .

Previous cases of this kind had been decided in the petitioner’s favor; but, due to legal technicalities and such, this case reached the Missouri Supreme Court where the ruling went against Scott; from there it went to the U.S. Supreme Court.

John Marshall was the fourth Chief Justice and served in that capacity from 1801 to 1835. His successor, appointed by Andrew Jackson, was Roger Taney (pronounced “Tawny”) of Maryland. Taney was a Jackson loyalist and also a Roman Catholic, the first but far from the last Catholic to serve on the Court.

In 1857, the Court declared the Missouri Compromise to be flat-out unconstitutional in the most egregious ruling in its history, the Dred Scott Decision. This was the first time since Marbury that a federal law was declared unconstitutional: in the 7-2 decision penned by Taney himself, the Chief Justice asserted that the federal government had no authority to control slavery in territories acquired after the creation of the U.S. as a nation, meaning all the land west of the Mississippi. Though the question was not strictly before the Court, Taney also ruled that even free African Americans could not be U.S. citizens and drove his point home with painful racist rhetoric that former slaves and their descendants “had no rights which the white man was bound to respect.” The Dred Scott Decision drove the country straight towards civil war.

Scott himself soon gained his freedom thanks to a member of a family who had supported his case. But sadly he died from tuberculosis in 1858 in St. Louis. Scott and his wife Harriet have been honored with a plaque on the St. Louis Walk of Fame along with Charles Lindbergh, Chuck Berry and Stan Musial; in Jefferson City, there is a bronze bust of Scott in the Hall of Famous Missourians, along with Scott Joplin, Walt Disney, Walter Cronkite, and Rush Limbaugh making for some strange bedfellows.

President James Buchanan, for his part, approved of the Taney decision, thinking it put the slavery question to rest. This is certainly part of the reason Buchanan used to be rated as the worst president in U.S. history. It is also thought by historians that Buchanan improperly consulted with Taney before the decision came down, perhaps securing Buchanan’s place in the rankings for the near future despite potential new competition in this arena.

The Dred Scott Decision wrecked the reputation of the Court for years – how blinded by legalisms could the justices have been not to realize what their rulings actually said! Charles Evans Hughes, Chief Justice from 1930 to 1941 and foe of FDR and his New Deal, lamented that the Dred Scott Decision was the worst example of the Court’s “self-inflicted wounds.” The Court did recover in time, however, to return to the practice of debatable, controversial decisions.

To start, in 1873, it gutted the 14th Amendment’s protection of civil rights by its 5-4 decision in the Slaughterhouse Cases, a combined case from New Orleans where a monopoly over slaughterhouses had been set up by the State Legislature. The decision seriously weakened the “privileges and immunities” clause of the Amendment:

    No State shall make or enforce any law which shall abridge the privileges or immunities of citizens of the United States.

There is a subtlety here: the U.S. Constitution’s Bill of Rights protects the citizen from abuse by the Federal Government; it is left to each state to have and to enforce its own protections of its citizens from abuse by the state itself. This clause was designed to protect the civil liberties of the nation’s new African-American citizens in the former slave states. The damage done by this decision would not be undone until the great civil rights cases of the next century.

In the Civil Rights Cases of 1883, the Supreme Court declared the Civil Rights Act of 1875 to be unconstitutional, thereby authorizing racial discrimination by businesses and setting the stage for Jim Crow legislation in former Confederate states and border states such as Maryland, Missouri and Kentucky – thus undoing the whole point of Reconstruction.

On the other hand, sandwiched around the Civil Rights Cases were some notable decisions that supported civil rights, such as Strauder v. West Virginia (1880), Boyd v. United States (1886) and Yick Wo v. Hopkins (1886). The first of these cases was brought to the Court by an African American and the third by a Chinese immigrant.

Somewhat later, during the “Gay Nineties,” the Supreme Court laid down some fresh controversial decisions to usher the U.S. into the new century. In 1895 there was In re Debs, where the Court allowed the government to obtain an injunction and use federal troops to end a strike against the Pullman Company. As a practical matter, this unanimous decision curbed the growing power of labor unions and, for the next forty years, “big business” would use court injunctions to suppress strikes. The “Debs” in this case was Eugene Debs, then the head of the American Railway Union; later the Court would uphold the Espionage Act (as amended by the Sedition Act of 1918) to rule against Debs in a case involving his speaking out against American participation in WWI.

It is interesting to note that this time, with In re Debs, the Court did not follow papal guidelines as it had with the Discovery Doctrine in Johnson v. McIntosh and other cases under John Marshall. This despite the fact that it had been handed an opportunity to do so. In his 1891 encyclical Rerum Novarum (literally “Of New Things,” a Latin idiom for revolution), Pope Leo XIII had come out in favor of labor unions; this was done at the urging of American cardinals and bishops who, at that time, were a progressive force in the Catholic Church. Without the urging of the U.S. hierarchy, the pope would likely have condemned unions as secret societies lumped in with the Masons and Rosicrucians.

As the turn of the century approached, the Court’s ruling in Plessy v. Ferguson (1896) upheld racial segregation on railroads, in schools and in other public facilities with the tag line “separate but equal.” Considered one of the very worst of the Court’s decisions, it legalized racial segregation for another seventy years. In fact, it has never actually been overturned. The celebrated 1954 case Brown v. Board of Education only ruled against it in the case of schools and educational institutions – one of the clever legal arguments the NAACP made was that, for Law Schools and Medical Schools, “separate but equal” was impossible to implement. Subsequent decisions weakened Plessy further but technically it is still “on the books.” The case itself dealt with “separate but equal” cars for railway passengers; one of the contributions of the Civil Rights movement to the economy of the New South is that it obviated the need for “separate but equal” subway cars and made modern transportation systems possible in Atlanta and other cities.

The number and range of landmark Supreme Court decisions expanded greatly in the twentieth century and that momentum continues to this day. We have drifted further and further from the view of Montesquieu and Hamilton that the Judiciary should be a junior partner next to the legislative and executive branches of government. The feared gouvernement des juges is upon us.

The Discovery Doctrine

John Marshall, the Federalist from Virginia and legendary fourth Chief Justice of the Supreme Court, is celebrated today for his impact on the U.S. form of government. To start, there is the decision Marbury v. Madison in 1803. In this ruling, the Court set a far-reaching precedent by declaring a law passed by Congress and signed by the President to be inconsistent with the Constitution – which at that point in time was a document six and a half pages long, its eleven amendments included. However, his Court laid down no other rulings declaring federal laws unconstitutional. So what sort of other stratagems did John Marshall resort to in order to leave his mark? Mystère.
One way the Marshall Court displayed its power was by means of three important cases involving the status and rights of Native Americans. The logic behind the first of these, Johnson v. McIntosh, is astonishing and the case is used in law schools today as a classic example of a bad decision. The basis of this unanimous decision, written by Marshall himself, is a doctrine so medieval, so racist, so Euro-centric, so intolerant, so violent as to beggar belief. Yet it is so buried in the record that few are even remotely aware of it today. It is called the Doctrine of Christian Discovery or just the Discovery Doctrine.
Simply put, this doctrine states that a Christian nation has the right to take possession of any territory whose people are not Christians.
The term “Discovery” refers to the fact that the European voyages of discovery (initially out of Portugal and Spain) opened the coast of Africa and then the Americas to European takeovers.
All this marauding was justified (even ordered) by edicts issued by popes written for Christian monarchs.
In his bull (the term for one of these edicts) entitled Romanus Pontifex (1452), Pope Nicholas V, in a burst of Crusader spirit, ordered the Portuguese King Alfonso V to “capture, vanquish, and subdue the Saracens, pagans, and other enemies of Christ,” to “put them into perpetual slavery,” and “to take all their possessions and property.” Columbus himself sailed with instructions to take possession of lands not ruled by Christian leaders. Alexander VI was the quintessential Renaissance pope, famous among other things for making nepotism something of a science – he was the father of Lucrezia Borgia (the passionate femme fatale of paintings, books and films, click HERE ) and of Cesare Borgia (the model for Machiavelli’s prince, click HERE ). In his Bulls of Donation of 1493, Alexander extended to Spain the right and duty to take sovereignty over all non-Christian territories “discovered” by its explorers and conquistadors; and then on behalf of Spain and Portugal, with the Line of Demarcation, Alexander divided the globe into two zones, one for each to subjugate.
Not to be left behind, a century or so later, when England and Holland undertook their own voyages of discovery and colonization, they adopted the Discovery Doctrine for themselves despite the Protestant Reformation; France did as well. What is more, after Independence, the Americans “inherited” this privilege; indeed, in 1792, U.S. Secretary of State Thomas Jefferson declared that the Discovery Doctrine would pass from Europe to the newly created U.S. government – interesting that Jefferson, deist that he was, would resort to Christian privilege to further U.S. interests! In American hands, the Discovery Doctrine also gave rise to doctrines like Manifest Destiny and American Exceptionalism.
The emphasis on enslavement in Romanus Pontifex is dramatic. The bull was followed by Portuguese incursion into Africa and Portuguese involvement in the African slave trade, till then a Muslim monopoly. In the 1500’s, African slavery became the norm in New Spain and in New Portugal. In August 1619, when the Jamestown colony was only 12 years old, a ship that the Dutch had captured from Portuguese slavers reached the English settlement and Africans were traded for provisions – one simple application of the Discovery Doctrine, one fateful day for the U.S.
Papal exhortations to war were not new in 1452. A bull of Innocent III in 1208 instigated a civil war in France, the horrific Albigensian Crusade. Earlier, in 1155 the English conquest of Ireland was launched by a bull of Pope Adrian IV (the only English pope no less); this conquest has proved long and bloody and has created issues still unresolved today. And even earlier there was the cry “God Wills It” (“Deus Vult”) of Pope Urban II and the First Crusade.
Hopping forward to the U.S. of 1823, in Johnson v. McIntosh, the plaintiff group, referred to as “Johnson,” claimed that a purchase of land from Native Americans in Indiana was valid although the defendant McIntosh for his part had a claim to overlapping land from a federal land grant (federal would prove key). An earlier lower court had dismissed the Johnson claim. Now (switching to the historical present) John Marshall, writing for a unanimous court, reaffirms the lower court’s dismissal of the Johnson suit. But that isn’t enough. After a lengthy discussion of the history of the European voyages of discovery in the Americas, Marshall focuses on the manner in which each European power acquired land from the indigenous occupants. He outlines the Discovery Doctrine and how a European power gains sovereignty over land its explorers “discover”; he adds that the U.S. inherited this power from Great Britain and reaches the conclusion that only the Federal Government can obtain title to Native American land. Furthermore, he concludes that indigenous populations only retain the “right of occupancy” in their lands and that this right can still be dissolved by the Federal Government.
One of the immediate upshots of this decision was that only the Federal Government could purchase land from Native Americans. Going forward, this created a market with only one buyer; a monopoly is created when there is only one seller; a market like this one with only one buyer is called a monopsony, a situation which could work against Native American interests – for the pronunciation of monopsony, click HERE . To counter the efforts of the Apple Computer company to muddy the waters, there’s just “one more thing”: the national apple of Canada, the name of the defendant in this case and the name of the inventor of the stylish raincoat are all written “McIntosh” and not “Macintosh.” (“Mc” is the medieval scribes’ abbreviation of “Mac” the Gaelic patronymic of Ireland and Scotland; other variants include “M’c”, “M'” and “Mc” with the “c” raised with two dots or a line underneath it.)
The decision in Johnson formalized the argument made by Jefferson that the Discovery Doctrine applied to relations between the U.S. government and Native Americans. This doctrine is still regularly cited in federal cases and only recently the Discovery Doctrine was invoked by none other than Justice Ruth Bader Ginsburg writing for the majority in City of Sherrill v. Oneida Indian Nation of New York (2005), a decision which ruled against Oneida claims to sovereignty over once tribal lands that the Oneida had managed to re-acquire!
What has happened here with Johnson is that John Marshall made the Discovery Doctrine part of the law of the land thanks to the common law reliance on precedent. A similar thing happens when a ruling draws on the natural law of Christian theology, a practice known as “natural law jurisprudence.” In effect, in both scenarios, the Court is making law in the sense of legislation as well as in the sense of a judicial ruling.
A few years after Johnson, in response to the state of Georgia’s efforts to badger the Cherokee Nation in an effort to drive them off their lands, the Cherokee asked the Supreme Court for an injunction to put a stop to the state’s practices. The case Cherokee Nation v. Georgia (1831) was dismissed by the Court on a technicality drawn from its previous decision – the Cherokee, not being a foreign nation but rather a “ward to its guardian” the Federal Government, did not have standing to sue before the Court; thereby adding injury to the insult that was Johnson.
The next year Marshall actually made a ruling in favor of the Cherokee nation in Worcester v. Georgia (1832) which laid the foundation for tribal sovereignty over their lands. However, this was not enough to stop Andrew Jackson from carrying out the removal of the Cherokee from Georgia in the infamous Trail of Tears. In fact, confronted with Marshall’s decision, Jackson is reported to have said “Let him enforce it.”
The U.S. is not the only country to use the Discovery Doctrine. In the English-speaking world, it has been employed in Australia, New Zealand and elsewhere. In the Dutch-speaking world, it was used as recently as 1975, with the accession of Suriname to independence, where it is the basis for the rights (or lack of same) of indigenous peoples. Even more recently, in 2007, the Russian Federation invoked it when placing its flag on the floor of the Arctic Ocean to claim oil and gas reserves there. Interesting that Orthodox Christians would honor papal directives once it was in their economic interest – reminiscent of Jefferson.
In addition to Marbury and the cases dealing with Native Americans, there are several other Marshall Court decisions that are accorded “landmark” status today such as McCulloch v. Maryland (1819), Cohens v. Virginia (1821) and Gibbons v. Ogden (1824) – all of which established the primacy of federal law and authority over the states. This consistent assertion of federal authority is the signature achievement of John Marshall.
Marshall’s term of 34 years is the longest for a Chief Justice. While his Court did declare state laws unconstitutional, for the Supreme Court to declare another federal law unconstitutional would take over half a century after Marbury. This would be the case that plunged the country into civil war. Affaire à suivre. More to come.

Marbury v. Madison

The Baron de Montesquieu and James Madison believed in the importance of the separation of powers among the executive, legislative and judicial branches of government. However, their view was that the third branch would not have power equal to that of either of the first two but enough so that no one branch would overpower the other two. In America today, things have shifted since 1789 when the Constitution became the law of the land: the legislative branch stands humbled by the reach of executive power and thwarted by endless interference on the part of the judiciary.
The dramatically expanded role of the executive can be traced to the changes the country has gone through since 1789 and the quasi-imperial military and economic role it plays in the world today.
The dramatically increased power of the judiciary is largely due to judicial review:
(a) the practice whereby a court can interpret the text of the Constitution itself or of a law passed by the Congress and signed by the President and tell us what the law “really” means, and
(b) the practice whereby a court can declare a law voted on by the Congress and signed by the President to be unconstitutional.
In fact, the term “unconstitutional” now so alarms the soul that it is even the title of Colin Quinn’s latest one-man show.
Things are very different in other countries. In the U.K., the Parliament is sovereign and its laws mean what Parliament says they mean. In France, under the Constitution of the Fifth Republic (1958), the reach of the Conseil Constitutionnel is very limited: Charles de Gaulle in particular was wary of the country’s falling into a gouvernement des juges – this last expression being a pejorative term for a situation like that in the U.S. today, where judges have power not seen since the time of Gideon and Samuel of the Hebrew Bible.
The U.S. legal system is based on the Norman French system (hence trial by a jury of one’s peers, “voir dire” and “oyez, oyez”) and its evolution into the British system of common law (“stare decisis” and the doctrine of precedent). So why is the U.S. so different from countries with which it has so much in common in terms of legal culture? How did this come about? In particular, where does the power to declare laws to be unconstitutional come from? Mystère.
A famous early example of Judicial Review occurred in Jacobean England about the time of the Jamestown settlement and about the time the King James Bible was finished. In 1610, in a contorted dispute known as Dr. Bonham’s Case over the right to practice medicine, Justice Edward Coke opined in his decision that “in many cases, the common law will control Acts of Parliament.” This was not well received; Coke lost his job and Parliamentary Sovereignty became established in England. Picking himself up, Coke went on to write his Institutes of the Lawes of England which became a foundational text for the American legal system and which is often cited in Supreme Court decisions, an example being no less a case than Roe v. Wade.
Another English jurist who had a great influence on the American colonists in the 18th century was Sir William Blackstone. His authoritative Commentaries on the Laws of England of 1765 became the standard reference on the Common Law, and in this opus, parliamentary sovereignty is unquestioned. The list of subscribers to the first edition of the Commentaries included future Chief Justices John Jay and John Marshall and even today the Commentaries are cited in Supreme Court decisions between 10 and 12 times a year. Blackstone had his detractors, however: Alexis de Tocqueville described him as “an inferior writer, without liberality of mind or depth of judgment.”
Blackstone notwithstanding, judicial review naturally appealed to the colonists: they were the target of laws enacted by a parliament where they had no representation; turning to the courts was the only recourse they had. Indeed, a famous and stirring call for the courts to overturn an act of the British Parliament was made by James Otis of Massachusetts in 1761. The Parliament had just renewed the hated writs of assistance and Otis argued (brilliantly it is said) that the writs violated the colonists’ natural rights and that any act of Parliament that took away those rights was invalid. Still, the court decided in favor of Parliament. Otis’ appeal to natural rights harkens back to Coke and Blackstone and to the natural law concept that was developed in the late Middle Ages by Thomas Aquinas and other scholastic philosophers. Appeal to natural law is “natural” when working in common law systems where there is no written text to fall back on; it is dangerous, however, in that it tugs at judges’ religious and emotional sensibilities.
Judicial review more generally emerged within the U.S. in the period under the Articles of Confederation where each state had its own constitution and legal system. By 1787, state courts in 7 of the 13 states had declared laws enacted by the state legislatures to be invalid.
A famous example of this took place in Massachusetts where slavery was still legal when the state constitution went into effect. Subsequently, in a series of cases known collectively as the Quock Walker Case, the state supreme court applied judicial review to overturn state law as unconstitutional and to abolish slavery in Massachusetts in 1783.
As another example at the state level before the Constitution, in New York the state constitution provided for a Council of Revision which applied judicial review to all bills before they could become law; however, a negative decision by the Council could be overturned by a 2/3 majority vote in both houses of the state legislature.
In 1784 in New York, in the Rutgers v. Waddington case, Alexander Hamilton, taking a star turn, argued that a New York State law known as the Trespass Act, which was aimed at punishing Tories who had stayed loyal to the Crown during the Revolutionary War, was invalid. Hamilton’s argument was that the act violated terms of the Treaty of Paris of 1783; this treaty put an end to the Revolutionary War and in its Articles VI and VII addressed the Tories’ right to their property. Clearly Hamilton wanted to establish that federal treaties overruled state law, but he may well also have wanted to keep Tories and their money in New York. Indeed, the British were setting up the English-speaking Ontario Province in Canada to receive such émigrés, including a settlement on Lake Ontario alluringly named York – which later took back its original Native Canadian name, Toronto. For a picture of life in Toronto in the old days, click HERE .
The role of judicial review came up in various ways at the Constitutional Convention of 1787. For example, with the Virginia Plan, Madison wanted there to be a group of judges to assist the president in deciding to veto a bill or not, much like the New York State Council of Revision – and here too this could be overturned by a supermajority vote in Congress. The Virginia Plan was not adopted; many at the Convention saw no need for an explicit inclusion of judicial review in the final text but they did expect the courts to be able to exercise constitutional review. For example, Elbridge Gerry of Massachusetts (and later of gerrymander fame) said federal judges “would have a sufficient check against encroachments on their own department by their exposition of the laws, which involved a power of deciding on their constitutionality.” Luther Martin of Maryland (though born in New Jersey) added that as “to the constitutionality of laws, that point will come before the judges in their official character…. “ For his part, Martin found that the Constitution as drawn up made for too strong a central government and opposed its ratification.
The Federalist Papers were newspaper articles and essays written by the founding fathers John Jay, James Madison and Alexander Hamilton, all using the pseudonym “Publius,” a tip of the hat to the great Roman historian Publius Cornelius Tacitus; for the relevance of Tacitus across time, try Tacitus by historian Ronald Mellor. A group formed around Hamilton and Jay, giving birth to the Federalist Party, the first national political party – it stood for a strong central government run by an economic elite; this quickly gave rise to an opposition group, the Democratic-Republicans (Jefferson, Burr, …), and the party system was born, somewhat to the surprise of those who had written the Constitution. Though in the end judicial review was left out of the Constitution, right after the Convention the need for it was brought up again in the Federalist Papers: in June 1788 Hamilton, already a star, published Federalist 78, in which he argued for the need for judicial review of the constitutionality of legislation as a check on abuse of power by the Congress. In this piece, he also invokes Montesquieu on the relatively smaller role the judiciary should have in government compared to the other two branches.
Fast forward two centuries: the Federalist Society is a political gate-keeper which was founded in 1982 to increase the number of right-leaning judges on the federal courts. Its founders included such high-profile legal thinkers as Robert Bork (whose own nomination to the Supreme Court was so dramatically scuttled by fierce opposition to him that it led to a coinage, the verb “to bork”). The Society regrouped and since then members Antonin Scalia, John G. Roberts, Clarence Thomas, Samuel Alito and Neil Gorsuch have acceded to the Supreme Court itself. (By the way, Federalist 78 is one of their guiding documents.)
Back to 1788: Here is what Section 1 of Article III of the Constitution states:
    The judicial Power of the United States shall be vested in one Supreme Court, and in such inferior Courts as the Congress may from time to time ordain and establish.
Section 2 of Article III spells out the courts’ purview:
    The judicial Power shall extend to all Cases, in Law and Equity, arising under this Constitution, the Laws of the United States, and Treaties made, or which shall be made, under their Authority;—to all Cases affecting Ambassadors, other public Ministers and Consuls;—to all Cases of admiralty and maritime Jurisdiction;—to Controversies to which the United States shall be a Party;—to Controversies between two or more States;—between a State and Citizens of another State;—between Citizens of different States;—between Citizens of the same State claiming Lands under Grants of different States, and between a State, or the Citizens thereof, and foreign States, Citizens or Subjects.
So while Article III lays out responsibilities for the court system, it does not say the courts have the power to review the work of the two other branches of government nor call any of it unconstitutional.
Clause 2 of Article VI of the Constitution is known as the Supremacy Clause and states that federal law overrides state law. In particular, this would imply that a federal court could nullify a law passed by a state. But, again, it does not allow for the courts to review federal law.
So there is no authorization of judicial review in the U.S. Constitution. However, given the precedents from the state courts and the positions of Madison, Gerry, Martin, Hamilton et al., it is as though lines from the Federalist 78 such as these were slipped into the Constitution while everyone was looking:
  The interpretation of the laws is the proper and peculiar province of the courts. A constitution is, in fact, and must be regarded by the judges as, a fundamental law. It therefore belongs to them to ascertain its meaning, as well as the meaning of any particular act proceeding from the legislative body. … If there should happen to be an irreconcilable variance between the [Constitution and an act of the legislature], the Constitution ought to be preferred to the statute.
The Supreme Court of the United States (SCOTUS) and the federal court system were created straightaway by the Judiciary Act passed by Congress and signed by President George Washington in 1789.
The first “big” case adjudicated by the Supreme Court was Chisholm v. Georgia (1793). Here the Court ruled in favor of the plaintiff Alexander Chisholm and against the State of Georgia, implicitly ruling that nothing in the Constitution prevented Chisholm from suing the state in federal court. This immediately led to an outcry amongst the states and to the 11th Amendment which precludes a state’s being sued in federal court without that state’s consent. So here the Constitution was itself amended to trump a Court decision.
The precedent for explicit judicial review was set seven years later in 1796 in the case Hylton v. United States: this was the first time that the Court ruled on the constitutionality of a law passed by Congress and signed by the President. It involved the Carriage Act of 1794, which placed a yearly tax of $16 on horse-drawn carriages owned by individuals or businesses. Hylton asserted that this kind of tax violated the powers of federal taxation as laid out in the Constitution, while Alexander Hamilton, back in the spotlight, pled the government’s case that the tax was consistent with the Constitution. Chief Justice Oliver Ellsworth and his Court decided in favor of the government, thus affirming the constitutionality of a federal law for the first time; by making this ruling, the Court claimed for itself the authority to determine the constitutionality of a law, a power not provided for in the Constitution but one assumed to come with the territory. This verdict held sway for about a century; it was overturned in 1895 (Pollock v. Farmers’ Loan and Trust) and then effectively reaffirmed after the passage of the 16th Amendment, which authorized a federal income tax without apportionment.
Section 13 of the Judiciary Act of 1789 mandated SCOTUS to order the government to do something specific for a plaintiff if the government is obliged to do so according to law but has failed to do so. In technical terms, the court would issue a writ of mandamus ordering the government to act – mandamus, meaning “we command it,” is derived from the Latin verb mandare.
John Marshall, a Federalist from Virginia, was President John Adams’s Secretary of State. When Jefferson, and not Adams, won the election of 1800, Adams hurried to make federal appointments ahead of Jefferson’s inauguration that coming March; these were the notorious “midnight appointments.” Among them was the appointment of John Marshall himself to the post of Chief Justice of the Supreme Court. Another was the appointment of William Marbury to a judgeship in the District of Columbia. It was Marshall’s job while he was still Secretary of State to prepare and deliver the paperwork and official certifications for these appointments. He failed to accomplish this in time for Marbury and some others; when Jefferson took office he instructed his Secretary of State, James Madison, not to complete the unfinished certifications.
In the “landmark” case Marbury v. Madison (1803), William Marbury petitioned the Court under Section 13 of the Judiciary Act to order the Secretary of State, James Madison, to issue the commission for Marbury to serve as Justice of the Peace in the District of Columbia as the certification was still unfinished thanks to John Marshall, now the Chief Justice. In a legalistic tour de force, the Court affirmed that Marbury was right and that his commission should be issued but ruled against him. John Marshall and his judges declared Section 13 of the Judiciary Act unconstitutional because it would (according to the Court) enlarge the authority of the Court beyond that permitted by the Constitution.
Let’s try to analyze the logic of this decision: put paradoxically, the Court could exercise a power not given to it in the Constitution to rule that it could not exercise a power not given to it in the Constitution. Put ironically, it ascribed to itself the power to be powerless. Put dramatically, Marbury, himself not a lawyer, might well have cheered on Dick the Butcher who has the line “let’s kill all the lawyers” in Henry VI, Part 2 – but all this business is less like Shakespeare and more like Aristophanes.
Declaring federal laws unconstitutional did not turn into a habit in the 19th century. The Marshall court itself did not declare any other federal laws to be unconstitutional but it did find so in cases involving state laws. For example, Luther Martin was on the losing side in McCulloch v. Maryland (1819) when the Court declared a Maryland state law levying a tax on a federally authorized national bank to be unconstitutional.
The story doesn’t end there.
Although another law wouldn’t be ruled unconstitutional by the Supreme Court until 1857, the two plus centuries since Marbury would see a dramatic surge in judicial review and outbreaks of judicial activism on the part of courts both left-wing and right-wing. There is a worrisome tilt toward increasing judicial power: “worrisome” because things might not stop there; the 2nd Book of Samuel (the last Judge of the Israelites) is followed by the 1st Book of Kings, something to do with the need to improve national defense. Affaire à suivre. More to come.

Voting Arithmetic

Voting is simple when there are only two candidates and lots of voters: people vote and one of the candidates will simply get more votes than the other (except in the very unusual case of a tie when a coin is needed). In other words, the candidate who gets a majority of the votes wins; naturally, this is called majority voting. Things start to get complicated when there are more than two candidates and a majority of the votes is required to elect a candidate; in this case, multiple ballots and lots of horse trading can be required until a winner emerges.

The U.S. has continued many practices inherited from England, such as the system of common law. One of the most important of these practices is the way elections are run. In the U.K. and the U.S., elections are decided (with some exceptions) by plurality: the candidate who polls the largest number of votes is the winner. Thus if there are 3 candidates and one gets 40% of the votes while the other two get 30% each, the one with 40% wins – there is no runoff. This is called plurality voting in the U.S. and relative majority voting in the U.K.
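For readers who want to see the tabulation spelled out, here is a minimal sketch in Python of plurality counting; the candidate names and vote shares are hypothetical, echoing the 40/30/30 example above.

    # Plurality ("first past the post"): the single largest vote-getter wins;
    # no majority and no runoff are required. Hypothetical vote shares.
    votes = {"A": 40, "B": 30, "C": 30}
    winner = max(votes, key=votes.get)
    print(winner)  # prints A, which wins with only 40% of the vote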

The advantages of plurality voting are that it is easy to tabulate and that it avoids multiple ballots. The glaring disadvantage is that it doesn’t provide for small party representation and leads to two-party dominance over the political system – a vote for a third-party candidate is a vote thrown away and counts for nothing. As a random example, look at the presidential election in the Great State of Florida in 2000: everything would have been the same if those who voted for Ralph Nader had simply stayed home and not voted at all. As a result, third parties never get very far in the U.S. and if they do pick up some momentum, it is quickly dissipated. In England, it is similar, with Labour and the Conservatives being the dominant parties since the 1920s. Is this handled differently elsewhere? Mystère.

To address this problem of third party representation, countries like Italy and Holland use proportional voting for parliamentary elections. For example, suppose there are 100 seats in parliament, that the 3 parties A, B and C propose lists of candidates and that party A gets 40% of the votes cast and B and C each get 30%; then A is awarded 40 seats in parliament while B and C each get 30 seats. With this system, more than two parties will be represented in parliament; if no one party has a majority of the seats, then a coalition government will be formed.
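To make the seat arithmetic concrete, here is a short Python sketch of one common allocation rule, the largest-remainder (Hamilton) method; the parties and vote shares are the hypothetical ones from the example above, and actual systems in Italy, Holland and elsewhere use their own variants and thresholds.

    # Largest-remainder seat allocation - a sketch, not any country's actual electoral law.
    def allocate_seats(votes, seats):
        total = sum(votes.values())
        quotas = {p: v * seats / total for p, v in votes.items()}
        # each party first receives the whole-number part of its exact share
        alloc = {p: int(q) for p, q in quotas.items()}
        # any seats left over go to the parties with the largest fractional remainders
        leftover = seats - sum(alloc.values())
        for p in sorted(quotas, key=lambda q: quotas[q] - alloc[q], reverse=True)[:leftover]:
            alloc[p] += 1
        return alloc

    print(allocate_seats({"A": 40, "B": 30, "C": 30}, 100))  # {'A': 40, 'B': 30, 'C': 30}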

Another technique used abroad is to have majority voting with a single runoff election. Suppose there are 4 parties (A, B, C, D) with candidates for president; a first election is held and the two top vote-getters (say, A and B) face off in a runoff a week or two later. During this time, C and D can enter into a coalition with A or B, and their voters will now vote for the coalition partner. So those minority votes in the first round live to fight another day. Also, the president, once elected, now represents a coalition, which mitigates the kind of extreme Manichaeism of today’s U.S. system. It can lead to some strange bedfellows, though. In the 2002 presidential election in France, to stop the far-right National Front candidate Jean-Marie Le Pen from winning the second round, the left-wing parties had to back their long-time right-wing nemesis Jacques Chirac.
However one might criticize and find fault with these countries and their election systems, the fact is that voter participation there is far higher than in the U.S.

For U.S. presidential elections, what with the Electoral College and all that, “it’s complicated.” For Hamilton, Madison and others, the Electoral College would serve as an additional buffer between the masses and the government: one way this was to be achieved was by means of the “faithless elector,” one who does not vote for the candidate he pledged to – this stratagem would overturn a mass vote for a potential despot. This was considered a feature and not a bug; this feature is still in force and some pledged electors do employ it – in the 2016 election, seven electors voted against their pledged candidates, two against Trump and five against Clinton. But, except for faithless electors, how else could the Electoral College stymie the will of the people? Mystère.

That the Electoral College can indeed serve as a buffer between the presidency and the population has been proven by four elections (1876, 1888, 2000, 2016) where the Democratic candidate carried the popular vote but the Republican candidate obtained a majority in the Electoral College; most scandalously, in the 1876 election, in a backroom deal, 20 disputed electoral votes were awarded to the Republican candidate Rutherford B. Hayes to give him a majority of 1 vote in exchange for the end of Reconstruction in the South – “probably the low point in our republic’s history” to cite Gore Vidal.

That the Electoral College can indeed serve as a buffer between the presidency and the population has also been proven by the elections of 1800 and 1824 where no candidate had a majority of the electoral vote; in this case, the Constitution specifies that the election is to be decided in the House of Representatives with each state having one vote. In 1824, the populist candidate, Andrew Jackson, won a plurality both of the popular vote and the electoral vote, but on the first ballot a majority of the state delegations, cajoled by Henry Clay, voted for the establishment candidate John Quincy Adams. In the election of 1800, Jefferson and Burr were the top electoral vote getters with 73 votes each. Jefferson won a majority in the House on the 36th ballot, his victory engineered by Hamilton who disliked Jefferson but loathed Burr – we know how this story will end, unfortunately.

For conspiracy theorists, it is worth pointing out not only that all four candidates who won the popular vote but lost the electoral vote were Democrats, but also that three of the four were from New York State, as was Aaron Burr.

The most obvious shortcoming of the Electoral College system is that it is a form of gerrymandering that gives too much power and representation to rural states at the expense of large urban states; in English terms, it creates “rotten boroughs.” For example, using 2018 figures, California has 55 electoral votes for 39,776,830 people and Wyoming has 3 votes for 573,720; so, if one does the math, 1 vote for president in Wyoming is worth 3.78 votes in California. Backing up, let us “show our work.” When we solved this kind of problem in elementary school, we used the rule, the “product of the means is equal to the product of the extremes”; thus, using the camel case dear to programmers, we start with the proportion

     votesWyHas : wyPopulation :: votesCaShouldHave : caPopulation

where : is read “is to” and “::” is read “as.” Three of the four terms have known values and so the proportion becomes

     3 : 573,720 :: votesCaShouldHave : 39,776,830

The above rule says that the product of the inner two terms is equal to that of the outer two terms. The term votesCaShouldHave is the unknown so let us call it x; let us apply the rule using * as the symbol for multiplication and let us solve the following equation:

     3 * 39,776,830 = 573,720 * x

which yields the number of electors California would have, were it to have as many electors per person as Wyoming does; this simplifies to

     x = (3 * 39,776,830)/ 573,720 = 207.99

So California would have to have 207.99 electors to make things fair; dividing this figure by 55, we find that 1 vote in Wyoming is worth 207.99/55 = 3.78 votes in California. This is a most undemocratic formula for electing the President. But things can get worse. If the race is thrown into the House, since California has 69.33 times as many people as Wyoming, the ratio jumps to 1.0 to 69.33, making this the most undemocratic way of selecting a President imaginable. For state by state population figures, click  HERE .
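
For readers who prefer code to proportions, here is the same arithmetic in Python, using the 2018 figures quoted above.

    # Solve the proportion 3 : 573,720 :: x : 39,776,830 for x.
    votes_wy, pop_wy = 3, 573_720
    votes_ca, pop_ca = 55, 39_776_830
    x = votes_wy * pop_ca / pop_wy        # electors California would need at Wyoming's rate
    print(round(x, 2))                    # 207.99
    print(round(x / votes_ca, 2))         # 3.78 -- one Wyoming vote ~ 3.78 California votes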

One simple way to mitigate the undemocratic nature of the Electoral College would be to eliminate the 2 electoral votes that correspond to each state’s 2 senators. This would change the Wyoming/California ratio and make 1 vote in the Equality State worth only about 1.31 votes in the Golden State. With this counting technique, Trump would still have won the 2016 presidential election 236 to 195 (much less of a “massive landslide” than 304 to 227, the official tally) but Al Gore would have won the 2000 race, 228 to 209, even without Florida (as opposed to losing 266 to 271).
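
Using the same 2018 figures, the recalculation without the two senatorial electors looks like this in Python; Wyoming keeps the 1 elector corresponding to its lone representative and California keeps 53.

    # Strip the two "senatorial" electors from each state and compare voting power.
    votes_wy, pop_wy = 1, 573_720
    votes_ca, pop_ca = 53, 39_776_830
    ratio = (votes_wy / pop_wy) / (votes_ca / pop_ca)
    print(round(ratio, 2))                # 1.31 -- still unequal, but far less so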

To tally the Electoral College vote, most states assign all their votes (via the “faithful” pledged electors) to the plurality winner for president in that state’s presidential tally. Nebraska and Maine, the two exceptions, use the congressional district method which assigns the two votes that correspond to Senate seats to the overall plurality winner and one electoral vote to the plurality winner in each congressional district in the state. By way of example, in an election with 3 candidates, suppose a state has 3 representatives (so 5 electoral votes) and that one candidate obtains 50% of the total vote and the other two 25% each; then if each candidate is the plurality winner in the vote from exactly one congressional district, the top vote-getter is assigned the 2 votes for the state’s senators plus 1 vote for the congressional district he or she has won and the other two candidates receive 1 electoral vote each. This system yields a more representative result but note that gerrymandering will still impact who the winner is in each congressional district. What is intrinsically dangerous about this practice, though, is that, if candidates for more than two parties are running, it can dramatically increase the chances that the presidential election will be thrown into the House of Representatives. In this situation, the Twelfth Amendment ordains that the 3 top electoral vote-getters must be considered for the presidency and so, if this method had been employed generally in the past, the elections of 1860 (Lincoln), 1912 (Wilson) and 1992 (Clinton) could well have given us presidents Stephen Douglas, Teddy Roosevelt and Ross Perot.
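
A minimal Python sketch of the congressional district method just described; the vote shares and district winners are the hypothetical ones from the example.

    # A state with 3 districts, hence 5 electoral votes (3 district votes + 2 statewide).
    statewide = {"A": 50, "B": 25, "C": 25}            # percent of the statewide vote
    district_winners = ["A", "B", "C"]                 # plurality winner in each district
    electors = {c: 0 for c in statewide}
    electors[max(statewide, key=statewide.get)] += 2   # the two "senatorial" votes
    for w in district_winners:
        electors[w] += 1                               # one vote per district carried
    print(electors)                                    # {'A': 3, 'B': 1, 'C': 1}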

The congressional district method is a form of proportional voting. While it could be disastrous in the framework of the Electoral College system for electing a U.S. president, proportional voting itself is successfully implemented in many countries to achieve more equitable outcomes than that furnished by plurality voting and the two party system.

A voting system which is used in many American cities, such as Minneapolis and Oakland, and in countries such as Australia and Ireland is known as ranked choice voting or instant-runoff voting. Voters in Maine recently voted for this system to be used in races for seats in the U.S. House of Representatives and in party primaries. Ranked choice voting emulates runoff elections but in a single round of balloting; it is a much more even-handed way to choose a winner than plurality voting. Suppose there are 3 candidates – A, B and C; then, on the ballot, each voter lists the 3 candidates in the order of that voter’s preference. For the first round, the number of first place votes each candidate received is counted; if for one candidate that number is a majority, that candidate wins outright. Otherwise, the candidate with the fewest first place votes, say A, is eliminated; now we go into the “second round” with only B and C as candidates and we add to B’s first place total the number of ballots for A that listed B as second choice, and similarly for C. Now, except in the case of a tie, either B or C will have a clear majority and will be declared the winner. This will give the same result that staging a runoff between B and C would yield. With 3 candidates, at most 2 rounds are required; if there were 4 candidates, up to 3 rounds could be needed, etc.
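
Here is a bare-bones Python sketch of the instant-runoff count just described; it assumes every voter ranks every candidate, and the 100 ballots are invented for illustration.

    from collections import Counter

    def instant_runoff(ballots):
        # Each ballot is a tuple of candidates in order of preference.
        remaining = set(c for b in ballots for c in b)
        while True:
            # Count each ballot for its highest-ranked candidate still in the running.
            tally = Counter(next(c for c in b if c in remaining) for b in ballots)
            leader, top = tally.most_common(1)[0]
            if top * 2 > len(ballots):                     # outright majority: done
                return leader
            remaining.discard(min(tally, key=tally.get))   # eliminate the last-place candidate

    # 100 ballots: A has 40 first places, B has 35, C has 25; C's voters prefer B to A.
    ballots = [("A", "B", "C")] * 40 + [("B", "C", "A")] * 35 + [("C", "B", "A")] * 25
    print(instant_runoff(ballots))                         # B wins the second round, 60 to 40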

Most interestingly, in the Maine 2018 election, in one congressional district no candidate for the House of Representatives gathered an absolute majority in the first round; the candidate who had received fewer first place votes in that round won in the second round, surging ahead because of the number of voters who had made him their second choice. (For details, click  HERE ).

Naturally, all this is being challenged in court by the losing side. However, for elections, Section 4 of Article 1 of the U.S. Constitution leaves implementation to the states for them to carry out in the manner they deem fit – subject to Congressional oversight but not to judicial oversight:

   “The Times, Places and Manner of holding Elections for Senators and Representatives, shall be prescribed in each State by the Legislature thereof; but the Congress may at any time by Law make or alter such Regulations, except as to the Places of chusing (sic) Senators.”

N.B. In this article of the Constitution, the senators are an exception because at that time senators were chosen by the state legislatures; direct election of senators by popular vote had to wait until 1913 and the 17th Amendment.

At the first legal challenge to it, the new Maine system was upheld vigorously in the United States District Court based in large part on Section 4 of Article 1 above. For the ruling itself, click  HERE . But the story will not likely end so simply.

This kind of voting system is also used by the Academy of Motion Picture Arts and Sciences to select the nominees in each category, but they call it preferential voting. So, to determine the five nominees for, say, Best Director, they apply the elimination process until only five candidates remain.

With ranked choice voting, in Florida, in that 2000 election, if the Nader voters listed Ralph Nader first, Al Gore (who was strong on the environment) second and George Bush third and if all Pat Buchanan voters listed Buchanan first, Bush second and Gore third, Gore would have carried the day by over 79,000 votes in the third and final round.

However much one might criticize and find fault with countries like Australia and Ireland and their election systems, the fact is that voter participation there is far higher than in the U.S. For numbers, click  HERE .

Ranked voting systems are not new and have been a topic of interest to social scientists and mathematicians for a long time now. The French Enlightenment thinker, the Marquis de Condorcet, introduced the notion of the Condorcet Winner of an election – the candidate who would beat all the other candidates in head-to-head contests based on the ballot rankings; he is also the author of Condorcet’s Paradox – the observation that the head-to-head preferences can form a cycle, so that a ranked choice setup might not produce a Condorcet Winner at all. To analyze this situation, the English mathematician Charles Lutwidge Dodgson introduced the Dodgson Method, an algorithm for measuring how far the result of a given election using ranked choice voting is from producing a Condorcet Winner. More recently, the mathematician and economist Kenneth Arrow proved Arrow’s Impossibility Theorem (often called Arrow’s Paradox), which shows that no ranked voting system can satisfy every reasonable fairness criterion at once; related results show that ranked voting can sometimes be gamed by exploiting the idea behind Condorcet’s Paradox: for example, it is possible that in certain situations, voters can assure the victory of their most preferred candidate by listing that candidate 2nd and not 1st – the trick is to knock out an opponent one’s favorite would lose to in a head-to-head election by favoring a weaker opponent who will knock out the feared candidate and who will then be defeated in the final head-to-head election. For his efforts, Arrow was awarded a Nobel Prize; for his efforts, Condorcet had a street named for him in Paris (click  HERE  ); for his efforts, Charles Lutwidge Dodgson had to latinize his first and middle names, then reverse them to form the pen name Lewis Carroll, and then proceed to write Alice in Wonderland and Jabberwocky, all to rescue himself from the obscurity that usually awaits mathematicians. For a detailed but playful presentation on paradoxes and ranked choice voting, click  HERE .
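
To make Condorcet’s idea concrete, here is a small Python sketch that looks for a Condorcet Winner among ranked ballots; the second example is the classic three-ballot cycle behind Condorcet’s Paradox, and all the ballots are invented for illustration.

    def condorcet_winner(ballots):
        # A Condorcet Winner beats every other candidate head-to-head on the ballots.
        candidates = set(c for b in ballots for c in b)
        for c in candidates:
            if all(sum(b.index(c) < b.index(o) for b in ballots) * 2 > len(ballots)
                   for o in candidates if o != c):
                return c
        return None   # Condorcet's Paradox: the head-to-head preferences form a cycle

    print(condorcet_winner([("A", "B", "C"), ("A", "C", "B"), ("B", "A", "C")]))   # A
    print(condorcet_winner([("A", "B", "C"), ("B", "C", "A"), ("C", "A", "B")]))   # None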

The Constitution – U.S. Scripture III

In 1787, the Confederation Congress called for a Constitutional Convention with the goal of replacing the Articles of Confederation with a form of government that had the central power necessary to lead the states and the territories.
This had to be a document very different from the Iroquois Great Law of Peace, from the Union of Utrecht and from the Articles of Confederation themselves. It had to provide for a centralized structure that would exercise legislative and executive power on behalf of all the states and territories. Were there existing historical precedents for a written document to set up a social contract of government? Mystère.
In antiquity, there was “The Athenian Constitution”; but this text, credited to Aristotle and his students at the Lyceum, is not a founding document; rather it is an after-the-fact compilation of the workings of the Athenian political system.
In the Middle Ages there was the Magna Carta of 1215 with its legal protections such as trial by jury and its limitations on the power of the king. Though the Magna Carta itself was quickly annulled by a bull of the crusade-loving Pope Innocent III as “illegal, unjust, harmful to royal rights and shameful to the English people,” it served as a template for insulating the citizen from the power of the state.
There is a book entitled “The English Constitution” but this was published by Walter Bagehot in the latter part of the 19th century and, like “The Athenian Constitution,” it is an account of existing practices and procedures rather than any kind of founding document. This is the book that the 13 year old Elizabeth is studying when taking history lessons with the provost of Eton in the TV series “The Crown.”
For an actual example of a nation’s constitution that pre-dates 1789, one has to go back to 1600, the year that the Constitution of San Marino was adopted. However, there is no evidence that the founding fathers knew anything of this at all. Since that time, this document has been the law of this land-locked micro-state and it has weathered many storms; most recently, during the Cold War, it gave San Marino its 15 minutes of fame when the citizens elected a government of Communist Party members and then peacefully voted them out of office twelve years later. For an image of St. Marinus, stonemason and founding father of this, the world’s smallest republic, click  HERE .
The English Bill of Rights of 1689, an Act of Parliament, is a constitutional document in that it transformed an absolute monarchy into a constitutional monarchy. This is the key role of a constitution – it tempers or replaces traditional monarchy based on the Divine Right of Kings with an explicit social contract. This sharing of power between the monarch and the parliament made England the first Constitutional Monarchy – in simple terms, the division of roles made the parliament the legislature and made the king the executive. To get a sense of how radical this development was, it took place only a few years after Louis XIV of France reportedly exclaimed “L’Etat, c’est moi.”
With independence brewing, the Continental Congress in May 1776 directed the colonies to draw up constitutions for their own governance. The immediate precursors to the U.S. Constitution then were the state constitutions of 1776, 1777 and 1780, born of the break with Great Britain.
An important influence on this generation of constitutional documents was the work of the French Enlightenment philosopher Montesquieu. In his Spirit of the Laws (1748), Montesquieu analyzed forms of government and how different forms matched with different kinds of nations – small nations being best suited to republics, medium sized nations to constitutional monarchies and very large ones to empires. His analysis broke government down into executive, legislative and (to a lesser extent) judicial powers and he argued that, to avoid tyranny, these should be separate and independent of each other so that the power of any one of these would not exceed the combined power of the other two.
The state of Connecticut did not adopt a new constitution but continued with its (sometimes updated) Royal Charter of 1662. In the matter of religious freedom, in Connecticut, the Congregational Church was effectively the established state religion until the Constitution of 1818. Elsewhere, the antidisestablishmentarians generally lost out. For example, in the case of New York, Georgia, Rhode Island, Pennsylvania and Massachusetts, the state constitution guaranteed freedom of religion; in New Hampshire, the Constitution of 1776 was silent on the subject, but “freedom of conscience”  was guaranteed in the expanded version of 1784.  In Delaware’s case, it prohibited the installation of an established religion; in Virginia’s case, it took the Virginia Statute for Religious Freedom, written by Thomas Jefferson in 1786, to stave off the threat of an established church. On the other hand, the Maryland Constitution only guaranteed freedom of religion to “persons professing the Christian religion” (which was the same as in Maryland’s famous Toleration Act of 1649, the first step in the right direction in the colonies). In the 1776 document, Anglicanism is the established religion in South Carolina – this was undone in the 1778 revision. In the North Carolina Constitution, Article 32 affirms “That no person who shall deny the being of God, or the truth of the Protestant religion … shall be capable of holding any office … within this State”; New Jersey’s Constitution had a similar clause. It seems that from a legal point of view, a state still has the authority to have its own established or favored religion and attempts to move in this direction are still being made in North Carolina and elsewhere – the First Amendment explicitly only prohibits Congress from setting up an established religion for the country as a whole.
The challenges confronting the Constitutional Convention in Philadelphia in 1787 were many – to craft a system with a sufficiently strong central authority but not one that could morph into a dictatorship or mob rule, to preserve federalism and states’ rights (in particular, for the defense of the peculiar institution of slavery), to preserve popular sovereignty through a system of elections, etc. Who, then, rose to the occasion and provided the intellectual and political drive to get this done? Mystère.
Thomas Jefferson was the ambassador to France, John Adams was the ambassador to Great Britain and neither attended the Convention. Benjamin Franklin was one of the few who took a stand for the abolition of slavery but to no avail; Alexander Hamilton had but a bit part and his main initiative was roundly defeated. George Washington took part only at James Madison’s urging (but did serve as president of the Convention). But it is Madison who is known as the Father of the Constitution.
Madison and the others were keen readers of the Roman historian Publius Cornelius Tacitus who pitilessly described the transformation of the Roman Senatorial class from lawmakers into sniveling courtiers with the transformation of the Roman Republic into the Roman Empire; Montesquieu also wrote about the end of the Roman Republic. On the other hand, the rumblings leading up to the French Revolution could be heard and the threat of mob rule was not unrealistic. So the fear of creating a tyrannical regime was very much with them.
Madison’s plan for a strong government that would not turn autocratic was, like some of the state constitutions, based on the application of ideas of Montesquieu. In fact, in Federalist No. 47, Madison (using the Federalist pseudonym Publius) developed Montesquieu’s analysis of the separation of powers further and enunciated the principle of “checks and balances.”
For his part, Hamilton pushed for a very strong central government modeled on the English system with his British Plan; however, this plan was not adopted, nor were the plans for structuring the government proposed by Virginia and by New Jersey. Instead, a balance between large and small states was achieved by means of the Connecticut Compromise: there would be a bicameral legislature with an upper house, the Senate, having two senators from each state; there would be a lower house, the House of Representatives, with each state having a number of representatives proportional to its population. While the senators would be appointed by the state legislatures, the representatives would be chosen by popular vote (restricted to white men of property, of course).
This bicameral setup, with its upper house, was designed to reduce the threat of mob rule. However, it also brought up the problem of computing each state’s population for the purpose of determining representation in the House of Representatives. The resulting Three-Fifths Compromise stipulated that 3/5ths of the slave population in a state would count toward the state’s total population for this computation. This compromise created the need for an electoral college to elect the president, since enslaved African Americans would not each have three-fifths of a vote! So the system of electors was introduced and each state would have one elector for each member of Congress.
Far from abolishing slavery, Article 1, Section 9, Clause 1 of the Constitution prohibited Congress from making any law that would interfere with the international slave trade until 1808 at the earliest. However, Jefferson had stood for ending this traffic since the 1770’s and, in his second term in 1807, the Act Prohibiting the Importation of Slaves was passed and the ban was initiated the following year.  However, the ban was often violated and some importation of slaves continued into the Civil War. In 1807, the British set up a similar ban on the slave trade in the Empire and the British Navy actively enforced it against ships of all nations off the coast of West Africa and elsewhere, technically classifying slave traders as pirates; this was an important impediment to the importation of slaves into the United States.
For Hamilton, Madison and others, the Electoral College would serve as an additional buffer between the masses and the government: one way this was to be achieved was by means of the “faithless elector,” one who does not vote for the candidate he pledged to – this stratagem would overturn a mass vote for a potential despot. This was considered a feature and not a bug; this feature is still in force and some pledged electors do employ it – in the 2016 election, seven electors voted against their pledged candidates, two against Trump and five against Clinton.
The Constitution left it to the states to determine who is eligible to vote. With some exceptions here and there at different times, the result was that only white males who owned property were eligible to vote. This belief in the “divine right of the propertied” has its roots in the work of John Locke; it also can be traced back to a utopian composition published in 1656 by James Harrington; in The Commonwealth of Oceana he describes an egalitarian society with an ideal constitution where there is a limit on how much property a family can own and rules for distributing property; there is a senate and there are elections and term limits. Harrington promulgated the idea of a written constitution arguing that a well-designed, rational document would curtail dangerous conflicts of interest. This kind of interest in political systems was dangerous back then; Oliver Cromwell blocked publication of Harrington’s work until it was dedicated to the Lord Protector himself; with the Stuart Restoration, Harrington was jailed in the Tower of London and died soon after as a result of mistreatment. For a portrait, click HERE .
In any case, it wasn’t until 1856 that even universal suffrage for white males became established in the U.S. For the enfranchisement of the rest of the population, it took the Civil War and constant militancy up to and during WWI. A uniform election day was not fixed until 1845 and there are no real federal guidelines for election standards. This issue is still very much with us, as demonstrated by a wave of voter suppression laws in the states newly released from the strictures of the Voting Rights Act by the Roberts Court with the 2013 decision in Shelby County v. Holder.
Finally, a four page document entitled Constitution of the United States of America was submitted to the states in September 1787 for ratification. This process required nine of the thirteen states; the first to ratify it was Delaware and the ninth was New Hampshire. There was no Bill of Rights and no provision for judicial review of legislation. Political parties were not expected to play a significant role and the provisions for the election of president and vice-president were so clumsy that they exacerbated the electoral crisis of 1800 which ultimately led to the duel between Aaron Burr and Alexander Hamilton.
The Confederation Congress declared the Constitution ratified in September 1788 and the first presidential election was held. Congress was seated and George Washington became President in the spring of 1789.
In American life, the Constitution has truly become unquestionable, sacred scripture and the word unconstitutional has the force of a curse. As a result, to a large extent, Americans are frozen in place and are not able to be forward looking in dealing with the myriad new kinds of problems, issues and opportunities that contemporary life creates.
For example, the Constitution provides for an Amendment process that requires ratification by 3/4ths of the states. When there were 13 states huddled together on the Eastern Seaboard, this worked fine and the first 10 amendments, The Bill of Rights, were passed quickly after the Constitution was adopted. However, today this process is most cumbersome. For instance, any change in the Electoral College system would require an amendment to the Constitution; but any 13 states could block an attempt at change and the 13 smallest states, which have barely 4% of the population, would not find it in their interest to make any such change, alas. Another victim is term limits for members of Congress. It is in states’ interest to have senators and representatives with seniority so they can accede to powerful committee chairmanships, etc.; this is the old Dixiecrat strategy that kept Strom Thurmond in the Senate until he was over 100 years old – but then the root of the word senator is the Latin senex which means “old man.” The Constitution does provide for a second way for it to be amended: 34 state legislatures would have to pass applications for a constitutional convention to deal with, say, term limits; this method has never been used successfully, but a group, “U.S. Term Limits,” is trying just that.
The idea of judicial review of laws passed by Congress did come up at the Convention. Madison first wanted there to be a set of judges to assist the president in deciding whether or not to veto a bill. In the end, nothing was set down clearly in the Constitution and the practice of having courts review constitutionality came about by a kind of judicial fiat when John Marshall’s Supreme Court ruled a section of an act of Congress to be unconstitutional. Today, any law passed by Congress and signed by the President has to go through an interminable process of review in the courts and, in the end, the law means only what the courts say it means. Contrast this with the U.K. where the meaning of a law is what Parliament says it is. As a result, with the Supreme Court politicized the way it is, law is actually made today by the Court and the Congress just licks its wounds. The most critical decisions are thus made by 5 unelected career lawyers. Already in 1921, seeing what was happening in the U.S., the French jurist Edouard Lambert coined the phrase “gouvernement des juges” for the way judges privileged their personal slant on cases before them to the detriment of a straightforward interpretation of the letter and spirit of the law.
The reduction of the role of Congress and the interference of the courts have also contributed to the emergence of an imperial presidency. The Constitution gives only Congress the right to levy tariffs or declare war; but now the president imposes tariffs, sends troops off to war, and governs mostly by executive order. Much of this is “justified” by a need for expediency and quick decision making in running such a complex country – but this is, as Montesquieu and others point out, the sort of thing that led to the end of the Roman Republic.

The Articles – U.S. Scripture II

The second text of U.S. scripture, the Articles of Confederation, gets much less attention than the Declaration of Independence or the Constitution. Still it set up the political structure by which the new country was run for its first thirteen years; it provided the military structure to win the war for independence; it furnished the diplomatic structure to secure French support during that war and then to reach an advantageous peace treaty with the British Empire.
The Iroquois Great Law of Peace and the Albany Plan of Benjamin Franklin that it inspired are certainly precursors of the Articles of Confederation. In fact, the Second Continental Congress set up a meeting with the Iroquois in Albany NY in August 1775. The colonists informed the Iroquois of the possibility of the colonies’ breaking with Britain, acknowledged the debt they owed them for their example and advice, and presumably tried to test whether the Iroquois would support the British in the case of an American move for independence. In the end, the Iroquois did stay loyal to the British (the Royal Proclamation of 1763 would have had something to do with that). The French Canadians also stayed loyal to the British; in this case too, it was likely a matter of preferring “the devil you know.”
Were there other forerunners to the Articles? Mystère.
Another precursor of the Articles was the Dutch Republic’s Union of Utrecht of 1579, which created a confederation of seven provinces in the north of the Netherlands. Like that of the Iroquois, the Dutch system left each component virtually independent except for issues like the common defense. This was not a democratic system in the modern sense in that each province was controlled by a ruling clique called the regents. For affairs of common interest, the Republic had a governing body called the Staten-Generaal (its parliament, the States General) in which each province had one vote. Henry Hudson’s voyage in 1609 was made on behalf of the Dutch East India Company, chartered by the Staten-Generaal, and he loyally named the westernmost outpost of New York City Staaten Eylandt in the parliament’s honor. (This is not inconsistent with the old Vaudeville joke that the name originated with the near-sighted Henry asking “Is dat an Eyelandt?!!!” in his broken Dutch.) Hudson stopped short of naming the mighty river he navigated for himself; the Native Americans called it the Mahicanituck and the Dutch simply called it the North River.
Like the American colonists but two hundred years earlier, to achieve independence the citizens of the Dutch Republic had to rebel against the mightiest empire of the time, in this case that of Philip II of Spain. However, the Dutch Republic in its Golden Age was the most prosperous country in Europe and among the most powerful, proving its military mettle in the Anglo-Dutch Wars of the 17th century – all of which gave rise to the unflattering English language expressions Dutch Courage (bravery fueled by alcohol), Dutch Widow (a woman of ill repute), Dutch Uncle (someone not at all avuncular), Dutch Comfort (a comment like “things could be worse”) and, of course, Dutch Treat. The Dutch Republic was also remarkable for protecting civil liberties and religious freedom, keys to the domestic tranquility that did find their way into the U.S. Bill of Rights. For a painting by the Dutch Master Abraham Storck of a scene from the Four Days Battle during the Second Anglo-Dutch War, click HERE.
The Articles of Confederation were approved by the Second Continental Congress on Nov 15, 1777. Though technically not ratified by the states until 1781, the Articles steered the new country through the Revolutionary War and continued to be in force until 1789. The Articles embraced the federalism of the Iroquois Confederation and the Dutch Republic; they rejected the principle of the Divine Right of Kings in favor of republicanism and embraced the idea of popular sovereignty, affirming that power resides with the people.
The Congress of the Confederation was a unicameral legislature (like the Staten-Generaal and Nebraska’s legislature). It had a presiding officer referred to as the President of the United States who organized the deliberations of the Congress, but who did not have executive authority. In all, there were ten presidents, John Hancock and Richard Henry Lee among them. John Hanson, a wealthy landowner and slaveholder from Maryland, was the first president and so wags claim that “John Hanson” and not “George Washington” is the correct answer to the trivia question “who was the first U.S. president” – by the way, the answer to the question “which president first called for a Day of Thanksgiving on a Thursday in November” is also “John Hanson.” For a statue of the man, click HERE.
The arrangement was truly federal: each state had one vote and ordinary matters required a simple majority of the states. The Congress could not levy taxes itself but depended on the states for its revenue. On the other hand, Congress could coin money and conduct foreign policy but decisions on making war, entering into treaties, regulating coinage, and some other important issues required the vote of nine states in the Congress.
Not surprisingly, given the colonists’ opposition to the Royal Proclamation of 1763, during the Revolutionary War the Americans took action to wrest the coveted land west of the Appalachians away from the British. George Rogers Clark, a general in the Virginia Militia (and older brother of William of “Lewis and Clark” fame) is celebrated for the Illinois Campaign and the captures of Kaskaskia (Illinois) and Vincennes (Indiana). For the Porte de Vincennes metro stop in Paris, click HERE.
As the French, Dutch and Spanish squabbled with the English over the terms of a treaty to end the American War of Independence and dithered over issues of interest to these imperial powers ranging from Gibraltar to the Caribbean to the Spice Islands to Senegal, the Americans and the English put together their own deal (infuriating the others, especially the French). This arrangement ceded the land east of the Mississippi and South of the Great Lakes (except for Florida) to the newly born United States. The Florida territory was transferred back once again to Spain. The French had wanted all that land east of the Mississippi and West of the Appalachians to be ceded to its ally Spain who also controlled the Louisiana Territory at this time. Given how Spain returned the Louisiana Territory to France by means of a secret treaty twenty years later, the bold American diplomatic dealings in the Treaty of Paris proved to be prescient; the Americans who signed the treaty with England were Benjamin Franklin, John Adams and John Jay.
The treaty with England was signed in 1783 and ratified by the Confederation Congress, then sitting at the Maryland State House in Annapolis, on January 14, 1784.
However, hostilities between American militias and British and Native American forces continued after Cornwallis’ defeat at Yorktown and even after the signing of the treaty that officially ended the war; in fact, the British did not relinquish Fort Detroit and surrounding settlements until Jay’s Treaty which took effect in 1796. Many thought this treaty made too many concessions to the British on commercial and maritime matters and, for his efforts, Jay was hanged and burned in effigy everywhere by anti-Federalists. Jay reportedly joked that he could find his way across the country by the light of his burning effigies. Click HERE for a political cartoon from the period.

A noted achievement of the Confederation Congress was the Ordinance of 1787 (aka the Northwest Ordinance), approved on July 13, when the Congress was seated at Federal Hall in New York City. The Northwest Territory comprised the future states of Illinois, Michigan, Wisconsin, Indiana and Ohio – the elementary school mnemonic was “I met Walter in Ohio.” Four of these names are Native American in origin; Indiana is named for the Indiana Land Company, a group of real estate investors. The Ordinance outlawed slavery in these areas (but it did include a fugitive slave clause), provided a protocol for territories’ becoming states, acknowledged the land rights of Native Americans, established freedom of navigation on lakes and rivers, established the principle of public education (including universities), … . In fact no time was wasted: land for a university in Ohio was set aside as early as 1787, and the resulting school, Ohio University, is located in Athens (naturally) and today has over 30,000 students. The Ordinance was re-affirmed when the Constitution replaced the Articles of Confederation.

With all these successes in war and in making peace, what drove the Americans to abandon the proven formula of a confederation of tribes or provinces and seek to replace it? Again, mystère.

While the Articles of Confederation were successful when it came to waging war and preparing territories for statehood, it was economic policy and domestic strife that made the case for a stronger central government.

Under the Articles of Confederation, the power to tax stayed with the individual states; in 1781 and again in 1786, serious efforts were made to amend the Articles so that the Confederation Congress itself could levy taxes; both efforts failed leaving the Congress without control over its own finances. During and after the war, both the Congress and individual states printed money, money that soon was “not worth a continental.”

In 1786 and 1787, a rebellion broke out in Western Massachusetts, in the area around Springfield; the leader who emerged was Daniel Shays, a farm worker who had fought in the Revolution (Lexington and Concord, Saratoga, …) and who had been wounded – but Shays had never been paid for his service in the Continental Army and he was now being pursued by the courts for debts. He was not alone, and petitions by yeoman citizens for relief from debts and taxes were not being addressed by the State Legislature. The rebels shut down court houses and tried to seize the Federal Armory in Springfield; in this, they were thwarted only by an ad hoc militia raised with money from merchants in the east of the state. Afterwards, many rebels, including Shays, swiftly escaped to neighboring states such as New Hampshire and New York, out of reach of the Massachusetts militia.

Shays’ Rebellion shook the foundations of the new country and accelerated the process that led to the Constitutional Convention of 1787. It dramatically highlighted the shortcomings of such a decentralized system in matters of law and order and in matters economic; in contrast, with the new Constitution, Washington as President was able to lead troops to put down the Whiskey Rebellion and Hamilton as Secretary of the Treasury was able to re-organize the economy (national bank, assumption of states’ debts, protective tariffs, …). Click HERE for a picture of George back in the saddle; as for “Hamilton” tickets, just wait for the movie.

The Articles continued to be the law of the land into 1789: the third text of U.S. scripture, the U.S. Constitution, was ratified by the ninth state New Hampshire on June 21, 1788 and the Confederation Congress established March 4, 1789 as the date for the country to begin operating under the new Constitution.
How did that work out? More to come. Affaire à suivre.

The Declaration – U.S. Scripture I

There are three founding texts for Americans, texts treated like sacred scripture. The first is the Declaration of Independence, a stirring document both political and philosophical; in schools and elsewhere, it is read and recited with religious spirit. The second is the Articles of Confederation; a government based on this text was established by the Second Continental Congress; despite the new country’s success in waging the Revolutionary War and in reaching an advantageous peace treaty with the British Empire, this document is not venerated by Americans in quite the same way because this form of government was superseded after thirteen years by that of the third founding text, the Constitution. These three texts play the role of secular scripture in the United States; in particular, the Constitution, although only 4 pages long without amendments, is truly revered and quoted like “chapter and verse.”

Athena, the Greek goddess, is said to have sprung full grown from the head of Zeus (and in full armor, to boot); did these founding texts just emerge from the heads and pens of the founding fathers? In particular, there is the Declaration of Independence. Did it have a precursor? Was it part of the spirit of the times, of the Zeitgeist? Mystère.

Though still under British rule in 1775 when hostilities broke out at Lexington and Concord, the colonies had had 159 years of self-government at the local level: they elected law makers, named judges, ran courts and collected taxes. Forward looking government took root early in the colonies. Already in 1619, in Virginia, the House of Burgesses was set up, the first legislative assembly of elected representatives in North America; in 1620, the Pilgrims drew up the Mayflower Compact before even landing; in 1683, the colonial assembly in New York passed the Charter of Liberties. A peculiar matrix evolved where there was slavery and indentured servitude on the one hand and progress in civil liberties on the other (one example: the trial of John Peter Zenger in New York City and the establishment of Freedom of the Press).

In fact, the period from 1690 till 1763 is known as the “period of salutary neglect,” where the British pretty much left the colonies to fend for themselves – the phrase “salutary neglect” was coined by the British parliamentarian Edmund Burke, “the father of modern conservatism.” Salutary neglect was abandoned at the end of the French and Indian War (aka The Seven Years War) which was a “glorious victory” for the British but which left them with large war debts; their idea was to have the Americans “pay their fair share.”

During the run-up to the French and Indian War, at the Albany Convention in 1754, Benjamin Franklin proposed the Albany Plan, which was an early attempt to unify the colonies “under one government as far as might be necessary for defense and other general important purposes.” The main thrust was mutual defense and, of course, it would all be done under the authority of the British crown.

The Albany Plan was influenced by the Iroquois’ Great Law of Peace, a compact that long predated the arrival of Europeans in the Americas; this compact is also known as the Iroquois Constitution. This constitution provided the political basis for the Haudenosaunee, aka the Iroquois Confederation, a confederacy of six major tribes. The system was federal in nature and left each tribe largely responsible for its own affairs. Theirs was a very egalitarian society and for matters of group interest such as the common defense, a council of chiefs (who were designated by senior women of their clans) had to reach a consensus. The Iroquois were the dominant Indian group in the northeast and stayed unified in their dealings with the French, British and Americans. In a letter to a colleague in 1751, Benjamin Franklin acknowledged his debt to the Iroquois with this amazing admixture of respect and condescension:

”It would be a strange thing if six nations of ignorant savages should be capable of forming such a union, and yet it has subsisted for ages and appears indissolvable, and yet a like union should be impractical for 10 or a dozen English colonies.”

The Iroquois Constitution was also the subject of a groundbreaking ethnographic monograph. In 1724, the French Jesuit missionary Joseph-François Lafitau published a treatise on Iroquois society, Mœurs des sauvages amériquains comparées aux mœurs des premiers temps (Customs of the American Indians Compared with the Customs of Primitive Times), in which he describes the workings of the Iroquois system and compares it to the political systems of the ancient world in an attempt to establish a commonality shared by all human societies. Lafitau admired this egalitarian society where each Iroquois, he observed, views “others as masters of their own actions and of themselves” and each Iroquois lets others “conduct themselves as they wish and judges only himself.”

The pioneering American anthropologist Lewis Henry Morgan, who studied Iroquois society in depth, was also impressed with the democratic nature of their way of life, writing “Their whole civil policy was averse to the concentration of power in the hands of any single individual.” In turn, Morgan had a very strong influence on Friedrich Engels’ The Origin of the Family, Private Property, and the State: in the Light of the Researches of Lewis H. Morgan (1884). Apropos of the Iroquois Constitution, Engels (using “gentile” in its root Latin sense of “tribal”) waxed lyrical and exclaimed “This gentile constitution is wonderful.” Engels’ work, written after Karl Marx’s death, had for its starting point Marx’s notes on Morgan’s treatise Ancient Society (1877).

All of these European writers thought that Iroquois society was an intermediate stage in a progression subject to certain laws of sociology, a progression toward a society and way of life like their own. Of course, Marx and Engels did not think things would stop there.

At the end of the French and Indian War, the British prevented the colonists, who numbered around 2 million at this point, from pushing west over the Appalachians and Alleghenies with the Royal Proclamation of 1763; indeed, his Majesty George III proclaimed (using the royal “our”) that this interdiction was to apply “for the present, and until our further pleasure be known.” This proclamation was designed to pacify French settlers and traders in the area and to keep the peace with Native American tribes, in particular the Iroquois Confederation who, unlike nearly all other tribes, did side with the British in the French and Indian War. It particularly infuriated land investors such as Patrick Henry and George Washington – the latter, a surveyor by trade, founded the Mississippi Land Company in 1763 just before the Proclamation with the expectation of profits from investments in the Ohio River Valley, an expectation dashed by the Proclamation. Designs by Virginians on this region were not surprising given Virginia’s purported westward reach at that time (for a map, click HERE); even today, the Ohio River is the Western boundary of West Virginia. Washington recovered financially and at the time of his death was a very wealthy man.

Though the Royal Proclamation was flouted by colonists who continued to migrate west, it was the first in the series of proclamations and acts that finally outraged the colonists to the point of armed rebellion. Interestingly, in Canada the Royal Proclamation still forms the legal basis for the land rights of indigenous peoples. Doubtless, this has worked out better for both parties than the Discovery Doctrine which has been in force in the U.S. since independence – for details, click  HERE .

The Proclamation was soon followed by the hated Stamp Act, which was a tax directly levied by the British government on colonists, as opposed to a tax coming from the governing body of a colony. This led to the Stamp Act Congress which was held in New York City in October 1765. It was attended by representatives from 9 colonies and famously published its Declaration of Rights and Grievances which included the key point “There should be no taxation without representation.” This rallying cry of the Americans goes back to the English Bill of Rights of 1689 which asserts that taxes can only be enacted by elected representatives: “levying taxes without grant of Parliament is illegal.”

Rumblings of discontent continued as the British Parliament and King continued to alienate the colonists.

In the spring of 1774, Parliament abrogated the Massachusetts Charter of 1691, which gave people a considerable say in their government. In September, things boiled over not in Boston, but in the humble town of Worcester. Militiamen took over the courts and in October at Town Meeting independence from Britain was declared. The Committees of Correspondence assumed authority. (For a most useful guide to the pronunciation of “Worcester” and other Massachusetts place names, click HERE. By the way, the Worcester Art Museum is outstanding.)

From there, things moved quickly and not so quickly. While the push for independence was well advanced in Massachusetts, the delegates to the First Continental Congress in the fall of 1774 were not prepared to take that bold step: in a letter John Adams wrote “Absolute Independency … Startle[s] People here.”  Most delegates attending the Philadelphia gathering, he warned, were horrified by “The Proposal of Setting up a new Form of Government of our own.”

But acts of insurrection continued. For example, in December 1774 in New Hampshire, activists raided Fort William and Mary, seizing powder and weaponry. Things escalated leading to an outright battle at Lexington and Concord in April, 1775.

By 1776, two years after the First Congress, the delegates to the Second Continental Congress, which had convened in Philadelphia on May 10, 1775, were ready to move in the direction of independence. In parallel, on June 12, 1776 at Williamsburg, the Virginia Constitutional Convention adopted the Virginia Declaration of Rights, which called for a break from the Crown and which famously begins with

Section 1. That all men are by nature equally free and independent and have certain inherent rights, of which, when they enter into a state of society, they cannot, by any compact, deprive or divest their posterity; namely, the enjoyment of life and liberty, with the means of acquiring and possessing property, and pursuing and obtaining happiness and safety.

This document was authored principally by George Mason, a planter and friend of George Washington. Meanwhile back in Philadelphia, Thomas Jefferson (who would have been familiar with the Virginia Declaration) was charged by a committee with the task of putting together a statement presenting the views of the Second Continental Congress on the need for independence from the British. After some edits by Franklin and others, the committee brought forth the founding American document, The Declaration of Independence, the second paragraph of which begins with that resounding universalist sentence:

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.

And it continues

Governments are instituted among Men, deriving their just powers from the consent of the governed

The ideas expressed in the Virginia Declaration of Rights and the Declaration of Independence are radical and they had to have legitimacy. So, we are back to the original mystère, where did that legitimacy come from?

Clearly, phrases like “all men are by nature equally free and independent” and “all men are created equal” reverberate with an Iroquois, New World sensibility. Scholars also see here the hand of John Locke, most literally in the use of Locke’s  phrase “the pursuit of Happiness”; they also see the influence of Locke in the phrase “consent of the governed” – the idea of popular sovereignty being a concept closely associated with social contract philosophers of the European Enlightenment such as Locke and Rousseau.

Others also see the influence of the Scottish Enlightenment, that intellectual flowering with giants like David Hume and Adam Smith. The thinkers whose work most directly influenced the Americans include Thomas Reid (“self-evident” is drawn from the writings of this founder of the Common Sense School of Scottish Philosophy) and Francis Hutcheson (“unalienable rights” is drawn from his work on natural rights – Hutcheson is also known for his impact on his student Adam Smith, who later held Hutcheson‘s Chair of Moral Philosophy at Glasgow). Still others hear echoes of Thomas Paine and the recently published Common Sense – that most vocal voice for independence.

Jefferson would have been familiar with the Scottish Enlightenment; he also would have read the work of Locke and, of course, Paine’s pamphlet. The same most probably applies to George Mason as well. In any case, the Declaration of Independence went through multiple drafts, was read and edited by others of the Second Continental Congress and eventually was approved by the Congress by vote; so it must also have reflected a generally shared sense of political justice.

On July 2, 1776, the Second Continental Congress did take that bold step and voted in favor of Richard Henry Lee’s motion for independence from Great Britain. On July 4th, they officially adopted the Declaration of Independence.

Having declared independence from the British Crown, the representatives at the Second Continental Congress now had to come up with a scheme to unify a collection of fiercely independent political entities. Where could they have turned for inspiration and example in a political landscape dominated by powerful monarchies? Mystère. More to come.

Champagne

The Rule of St Benedict goes back to the beginning of the Dark Ages. Born in central Italy at the end of the 5th century, at the end of the Roman Empire, Benedict is known as the Father of Western Monasticism; he laid down a social system for male religious communities that was the cornerstone of monastic life in Western Europe for a thousand years and one that endures to this day. It was a way of life built around prayer, manual work, reading and, of course, submission to authority. The Benedictine abbey was led by an abbot; under him were the priests who were the ones to copy manuscripts and to sing Gregorian chant; then there were the brothers who bore the brunt of much of the physical work and lay people who bore as much or more. The Rule of St. Benedict is followed not only by the Benedictines, the order he founded, but also by the Cistercians, the Trappists and others.

The monasteries are credited with preserving Western Civilization during the Dark Ages; copying manuscripts led to the marvelous decorative calligraphy of the Book of Kells and other masterpieces – the monks even introduced the symbol @ (the arobase, aka the “at sign”) to abbreviate the Latin preposition ad. They are also credited with sending out those fearless missionaries who brought literacy and Christianity to pagan Northern tribes, bringing new people into the orbit of Rome: among them were Winfrid (aka Boniface) from Wessex who chopped down sacred oak trees to proselytize German tribes and Willibrord from Northumbria who braved the North Sea to convert the fearsome Frisians, destroying pagan sanctuaries and temples in the process.

The monasteries also accumulated vast land holdings (sometimes as donations from aging aristocrats who were more concerned with making a deal with God than with the future of their children and heirs). With land and discipline came wealth and monasteries became the target of choice for Viking marauders. At the time of the Protestant Reformation, the monasteries fell victim to iconoclasts and plundering potentates. Henry VIII dissolved the Cistercian abbey at Rievaulx in Yorkshire and other sites, procuring jewelry for Anne Boleyn in the process. This was the kind of thing that provoked the crypto-Catholic poet Shakespeare to lament about the

Bare ruin’d choirs, where late the sweet birds sang

Even today, though in ruins, Rievaulx is magnificent as a visit to Yorkshire or a click HERE will confirm. It can also be noted that the monks at Rievaulx abandoned the Rule of St. Benedict in the 15th century and became rather materialistic; this probably made them all the more tempting a target for Henry VIII.

We also owe the great beers to the monks. Today, two of the most sought after Belgian beers in the world are Trappiste and Abbaye de Leffe. The city of Munich derives its name from the Benedictine monastery that stood in the center of town. Today’s Paulaner and Augustiner beers trace back to monasteries. But what was it that turned other-worldly monks into master brewers? Mystère.

The likely story is that it was the Lenten fast that drove the monks to secure a form of nourishment acceptable to the prying papal legates. Theirs was a liquid only fast from Ash Wednesday through Holy Saturday, including Sundays. The beers they crafted passed muster as non-solid food and were rich in nutrients. There are also stories that the strong brews would be considered undrinkable by Italian emissaries from Rome and so would be authorized to be drunk during Lent as additional mortification of the flesh.

Indeed, the followers of the Rule of St. Benedict didn’t stop there. We also owe bubbly to the Benedictines. The first documented sparkling wine was made in 1531 at the Benedictine Abbey of St. Hilaire in the town of Limoux in the south of France – close to the Mediterranean, between Carcassonne and Perpignan – a mere 12 miles from Rennes-le-Château (of Da Vinci Code fame). Though wine-making went back centuries and occasionally a wine would have a certain effervescence, these churchmen discovered something new. What was this secret? Mystère.

These Benedictine monks were the first to come upon the key idea for bubbly – a second fermentation in the bottle. Their approach is what is now called the “ancestral method”: first making a white wine in a vat or barrel (“cuve” in French), interrupting the vinification process, putting it in bottles and adding yeast, grape sugar or some alcohol, corking it and letting it go through a second fermentation in the (hopefully strong and well-sealed) bottle. This is the technique used for the traditional Blanquette de Limoux; it is also used for the Clairette de Die. The classic Blanquette de Limoux was not strong in alcohol, around 6-8%, and it was rather doux and not brut. Today’s product comes in brut and doux versions and is 12.5% alcohol.

By the way, St. Hilaire himself was not a monk but rather a 4th-century bishop and defender of the orthodoxy of the Nicene Creed (the one intoned in the Catholic and Anglican masses and other services); he is known as the “Athanasius of the West,” which puts him in the company of a man with a creed all his own – the Athanasian Creed, which forcefully affirms the doctrine of the Triune God and is read on Trinity Sunday.

The original ancestral method from Limoux made for a pleasant quaff, but not for the bubbly of today. Did the Benedictines come to the rescue once again? What devilish tricks did they come up with next? Or was it done by some other actors entirely? Mystère.

This time the breakthrough to modern bubbly took place in the Champagne region of France. This region is at the point where Northern and Southern Europe meet. In the late Middle Ages, it was one of the more prosperous places in Europe, a commercial crossroads with important fairs at cities like Troyes and Rheims. The cathedral at Rheims is one of the most stunning in Europe, and the French kings were crowned there from Louis the Pious in 816 to Charles X in 1825. In fact, Rheims has been a city known to the English-speaking world for so long that its name in French (Reims) has diverged from the older spelling which we still use in English. It is also where English Catholics in the Elizabethan and Jacobean periods published the Douay-Rheims translation of the Latin Vulgate.

Enter Pierre Pérignon, son of a prosperous bourgeois family, who in 1668 joined the Benedictine monastery at St. Pierre de Hautvillers. The order by that time deigned to accept non-noble commoners as monks and would dub them dom, a title drawn from the Latin word for lord, dominus, to make up for their lack of a traditional aristocratic handle.

Dom Pérignon was put in charge of wine making and wine storage, both critical to the survival of the monastery, which had fallen on hard times with only a few monks left and things in a sorry state. The way the story is told in France, it was during a pilgrimage south and a stay with fellow Benedictines at the monastery of St. Hilaire in Limoux that he learned the ancestral method of making sparkling wine. However he learned of their techniques, he spent the rest of his life developing and perfecting the idea. By the time of his death in 1715, champagne had become the preferred wine at the court of Louis XIV and the wine of choice of the fashionable rich in London. Technically, Dom Pérignon was a master at choosing the right grapes to blend to make the initial wine. He abandoned the ancestral technique of interrupting the fermentation in the cuve or vat and let the wine complete its first fermentation. Then, to deal with the problem of dregs caused by the second fermentation in the bottle, he developed the elaborate practice of rotating each bottle by a quarter turn, a step repeated every day for two or more months and known as the remuage (click HERE for an illustration); it is said that professionals can do 40,000 bottles a day.

To all that, one must add that he found a solution to the “exploding bottle problem”: as the pressure of the CO2 that creates the bubbles builds up during the fermentation in the bottle, bottles can explode spontaneously and even set off a chain reaction. To deal with this, Dom Pérignon turned to bottle makers in London who could make bottles able to withstand the build-up of all that pressure. The indentation in the bottom of the bottle (the punt or kick-up in English, cul in French) was also modified, and better corks from Portugal played a part as well.

Putting all this together yielded the champagne method. Naturally, winemakers in other parts of France have applied this process to their own wines, making for some excellent bubbly; these wines used to carry the label “Méthode Champenoise” or “Champagne Method.” Protection of the term Champagne itself is even written into the Treaty of Versailles, and more recently the restriction was extended so that only wines from the Champagne region may even be labeled “Champagne Method.” So the other wines produced this way are now called crémants (a generic term for sparkling wine). Thus we have the Crémant d’Alsace, the Crémant de Bourgogne, the Crémant de Touraine and even the Crémant de Limoux. All in all, these crémants constitute the best value in French wines on the market. Other countries have followed the French lead and do not use the label “Champagne” or “Champagne Method” for sparkling wines; even the USA now follows the international protocol (although wines labeled Champagne prior to 2006 are exempt).

Admittedly, champagne is a marvelous beverage. It has great range and goes well with steak and with oysters. The sound of the pop of a champagne cork means that the party is about to begin. Of course, one key thing is that champagne provides a very nice “high,” and it does that without resorting to high levels of alcohol. Drolly, at wine tastings and similar events, the quality of the “high” provided by the wine is never discussed directly – instead people skirt around it, talking about legs and color and character and whatnot, when the key thing is the quality of the “high” and the percentage of alcohol in the wine. So how do you discuss this point in French itself? Mystère. In fact, there is no way to deal with all this in French, la Langue de Voltaire. The closest you can come to “high” is “ivresse,” but that has a negative connotation; you might force the situation and try something like “douce ivresse,” but that doesn’t work either. ’Tis a mystère without a solution, then. But it is the main difference between a $60 bottle of Burgundy and a $15 bottle of Pinot Noir – do the experiment.

There are some revisionist historians who try to diminish the importance of Dom Pérignon in all this. But he has been elevated to star status as an icon for the Champagne industry. As an example, click HERE for a statue in his honor erected by Moët & Chandon; click HERE for an example of an advertisement they actually use to further sales.

So the secret to the ancestral method and the champagne method is that second fermentation in the bottle. But now we have popular alternatives to champagnes and crémants in the form of Prosecco and Cava. If these sparkling wines are not made with the champagne method, how are they made? Mystère.

In fact, the method used for these effervescent wines does not require that second fermentation in the bottle, a simplification made possible by the use of large pressurized tanks, today made of stainless steel. This newer method carries out the secondary fermentation in closed tanks kept under pressure. This simplifies the entire production process and makes the bubbly much less expensive to produce. The process was first invented by Federico Martinotti in Italy and then improved by Eugène Charmat in France, around the turn of the 20th century. So it is known as the metodo italiano in Italy and the Charmat method almost everywhere else. In the 1960’s things were improved further to allow for less doux, more brut wines. They are now extremely popular and considered a good value by the general wine-loving public.

Finally, there is the simplest method of all to make sparkling wine or cider – inject carbon dioxide directly into the finished wine or cider, much like making seltzer water from plain water with a SodaStream device. In fact, this is the method used for commercial ciders. Please don’t try this at home with a California chardonnay if you don’t have strong bottles, corks and protective wear.

Indeed, beer and wine have played an important role in Western Civilization; wine is central to the rituals of Judaism and Christianity, and the ancient Greeks and Romans even had a god of wine. In fact, beer and wine go back to the earliest stages of civilization. When hunter-gatherers metamorphosed into farmers during the Neolithic Revolution and began civilization as we know it, they traded a lifestyle in which they were taller, healthier and longer-lived for one in which they were shorter, less healthy, shorter-lived and subject to the hierarchy of a social structure with priests, nobles, chiefs and later kings. On the other hand, archaeological evidence points to the fact that one of the first things these farmers did was to make beer by fermenting grains. Though the PhD thesis hasn’t been written yet, it is perhaps not unsafe to conclude that fermentation made this transition bearable and might even have been the start of it.

 

 

 

North America III

When conditions allowed, humans migrated across the Bering Land Bridge moving from Eurasia to North America thousands of years before the voyages of discovery of Columbus and other European navigators. That raises the question whether there were Native Americans who encountered Europeans before Columbus. If so, were these encounters of the first, second or third kind? Mystère.
For movement west by Europeans before Columbus, first we have the Irish legend of the voyage of St. Brendan the Navigator. Brendan and his 16 mates (St. Malo among them) sailed in a currach – an Irish fisherman’s boat with a wooden frame over which are stretched animal skins (nowadays they use canvas and sometimes anglicize the name to curragh). These seafarers reportedly reached lands far to the West, even Iceland and Newfoundland, in the years 512-530 A.D. All this was presented as fact by nuns in parochial schools of yore, but, begorrah, there is archaeological evidence of the presence of Irish visitors in Iceland before any Viking settlements there. Moreover, in the 1970’s the voyage of St. Brendan was reproduced by the adventurer Tim Severin and his crew, which led to a best-selling book, The Brendan Voyage, lending further credence to the nuns’ version of history. However, there is no account in the legends of any contact with new people; for contact with a mermaid, click HERE.
In the late 9th century, Viking men accompanied by (not so willing) Celtic women reached Iceland and established a settlement there (conventionally dated to 874 A.D.). Out of Iceland came the Vinland Saga of the adventures of Leif Erikson (who converted to Christianity and so gained favor with the later saga compilers) and of his fearless sister Freydis Eriksdottir (who also led voyages to Vinland but who stayed true to her pagan roots). It has been established that the Norse did indeed reach Newfoundland and did indeed begin to establish a colony there; the site at L’Anse aux Meadows has yielded abundant archaeological evidence of this – all taking place around 1000 A.D. The Norse of the sagas called the indigenous people they encountered in Vinland the Skraeling (which can be translated as “wretched people”). These people were not easily intimidated; there were skirmishes and more between the Skraeling (who did have bows and arrows) and the Norse (who did not have guns). In one episode, Freydis grabs a fallen Viking’s sword and drives the Skraeling attackers off on her own – click HERE for a portrait of Freydis holding the sword to her breast.
Generally speaking, the warlike nature of the Skraeling is credited with keeping the Vikings from establishing a permanent beachhead in North America. So these were the first Native Americans to encounter Europeans, solving one mystery. But exactly which Native American group were the Skraeling? What is their migration story? Again mystère.
The proto-Inuit or Thule people, ancestors of the modern Inuit, emerged in Alaska around 1000 B.C. Led by their sled dogs, they “quickly” made their way east across Arctic Canada, then down into Labrador and Newfoundland. The proto-Inuit even made their way across Baffin Bay to Greenland. What evidence there is supports the idea that these people were the fierce Skraeling of the sagas.
As part of the movement west, again according to the sagas, Norse settlers came to Greenland around 1000 A.D. led by the notorious Erik the Red, father of both Leif and Freydis. Settlers from Norway as well as Iceland joined the Greenland colony and it became a full-blown member of medieval Christendom, what with churches, a monastery, a convent and a bishop. In relatively modern times, around 1200 A.D., the proto-Inuit reached far enough south in Greenland to encounter the Europeans there. Score one more for these intrepid people. These are the only two pre-Columbian encounters between Europeans and Native Americans that are well established.
In the end, though, the climate change of the Little Ice Age (which began around 1300 A.D.) and the Europeans’ impact on the environment proved too much, and the Greenland colony died out sometime in the 1400’s. The proto-Inuit population with their sled dogs stayed the course (though not without difficulty) and survived. As a further example of the impact of the climate on Western Civilization, the Little Ice Age practically put an end to wine making in the British Isles; the crafty and thirsty Scots then made alcohol from grains such as barley, and Scotch whisky was born.
The success of the proto-Inuit in the Arctic regions was based largely on their skill with the magnificent sled dogs that their ancestors had brought with them from Asia. The same can be said for the Mahlemut people of Alaska and their Alaskan Malamute. Both these wolf-like marvels make one want to read or reread The Call of the Wild; for a Malamute picture, click HERE.
We know the Clovis people and other populations in the U.S. and Mexico also had dogs that had come from Siberia. But today in the U.S. we are overwhelmed with dogs of European origin – the English Sheepdog, the Portuguese Water Dog, the Scottish Terrier, the Irish Setter, the French Poodle, the Cocker Spaniel, and on and on. What happened to the other dogs of the Native Americans? Are they still around? Mystère.
The simple answer is that, for the most part, in the region south of the Arctic, the native dogs were simply replaced by the European dogs. However, for the lower 48, it has been established recently that the Carolina Dog, a free-ranging, dingo-like dog, is Asian in origin. In Mexico, the Chihuahua is the only surviving Asian breed; the Xoloitzcuintli (aka the Mexican Hairless Dog) is thought to be a hybrid of an original Asian dog and a European breed. (Some additional Asian breeds have survived in South America.)
Still, it is surprising that the native North Americans south of the Arctic regions switched over so quickly and so completely to the new dogs. A first question that comes to mind is whether or not these two kinds of dogs were the same species; but this question can’t be answered cleanly since dogs and wolves are already hard to distinguish – they can still interbreed and their DNA split goes back at most 30,000 years. A more tractable formulation is whether the Asian dogs and the European dogs are the product of the same domestication event or of different domestication events. However, “it’s complicated.” One sure thing we know comes from the marvelous cave at Chauvet-Pont-d’Arc in southern France, where footprints of a young child walking side by side with a dog or wolf have been found which date back some 26,000 years. This would point to a European domestication event; however, genetic evidence tends to support an Asian origin for the dog. Yet another theory, backed by logic and some evidence, is that both events took place but that the Asian dogs subsequently replaced the original European dogs in Western Eurasia.
For one thing, the dogs that came with the Native Americans from Siberia were much closer to the dogs of an original Asian self-domestication that took place in China or Mongolia (according to PBS). In any case, they would not have gone through as intense and specialized a breeding process as would dogs in populous Europe, a process that made the European dogs more useful to humans south of the Arctic and more compatible with domestic animals such as chickens, horses and sheep. Until the arrival of Europeans, the Native Americans did not have such domesticated livestock and did not have resistance to the poxes associated with them.
The role of dogs in the lives of people grows ever more important and dogs continue to take on new work roles – service dogs, search and rescue dogs, guide dogs, etc. People with dogs reportedly get more exercise, get real emotional support from their pets and live longer. And North America has given the world the wonderful Labrador Retriever. The “Lab” is a descendant of the St. John’s Water Dog, which was bred in the Province of Newfoundland and Labrador; this is the only Canadian province to bear a Portuguese name – most probably for the explorer João Fernandes Lavrador, who claimed the area for the King of Portugal in 1499, an area purportedly already well known to Portuguese, Basque and other fearless fishermen who were fishing the Grand Banks before Columbus (but we have no evidence of encounters of such fishermen with the Native Americans). At one point later in its development, the Lab was bred in Scotland by the Duke of Buccleuch, whose wife the Duchess was Mistress of the Robes for Queen Victoria (played by Diana Rigg in the PBS series Victoria – for the actual duchess, click HERE). From Labrador and Newfoundland, the Lab has spread all over North America and is now the most popular dog breed in the U.S. and Canada – and in the U.K. as well.