Voting Arithmetic

Voting is simple when there are only two candidates and lots of voters: people vote and one of the candidates will simply get more votes than the other (except in the very unusual case of a tie, when a coin toss is needed). In other words, the candidate who gets a majority of the votes wins; naturally, this is called majority voting. Things start to get complicated when there are more than two candidates and a majority of the votes is required to elect a candidate; in this case, multiple ballots and lots of horse trading can be required until a winner emerges.

The U.S. has continued many practices inherited from England, such as the system of common law. One of the most important of these practices is the way elections are run. In the U.K. and the U.S., elections are decided (with some exceptions) by plurality: the candidate who polls the largest number of votes is the winner. Thus if there are 3 candidates and one gets 40% of the votes while the other two get 30% each, the one with 40% wins – there is no runoff. This is called plurality voting in the U.S. and relative majority voting in the U.K.

The advantages of plurality voting are that it is easy to tabulate and that it avoids multiple ballots. The glaring disadvantage is that it doesn’t provide for small party representation and leads to two party dominance over the political system – a vote for a third party candidate is a vote thrown away and counts for nothing. As a random example, look at the presidential election in the Great State of Florida in 2000: everything would have been the same if those who voted for Ralph Nader had simply stayed home and not voted at all. As a result, third parties never get very far in the U.S. and if they do pick up some momentum, it is quickly dissipated. In England, it is similar with Labour and the Conservatives being the dominant parties since the 1920’s. Is this handled differently elsewhere? Mystère.

To address this problem of third party representation, countries like Italy and Holland use proportional voting for parliamentary elections. For example, suppose there are 100 seats in parliament, that the 3 parties A, B and C propose lists of candidates and that party A gets 40% of the votes cast and B and C each get 30%; then A is awarded 40 seats in parliament while B and C each get 30 seats. With this system, more than two parties will be represented in parliament; if no one party has a majority of the seats, then a coalition government will be formed.
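The seat arithmetic above can be sketched in a few lines of Python. This is a minimal illustration, not any particular country's actual rule; it uses the largest-remainder (Hamilton) method, one common way to round fractional quotas, with the parties and vote shares from the example above.

```python
from math import floor

def allocate_seats(votes, seats):
    """Largest-remainder (Hamilton) allocation of parliamentary seats."""
    total = sum(votes.values())
    quotas = {p: v * seats / total for p, v in votes.items()}
    alloc = {p: floor(q) for p, q in quotas.items()}
    # Hand any leftover seats to the largest fractional remainders.
    leftover = seats - sum(alloc.values())
    for p in sorted(quotas, key=lambda p: quotas[p] - alloc[p], reverse=True)[:leftover]:
        alloc[p] += 1
    return alloc

print(allocate_seats({"A": 40, "B": 30, "C": 30}, 100))  # → {'A': 40, 'B': 30, 'C': 30}
```

With the 40/30/30 split everything divides evenly; the remainder step only matters when the quotas come out fractional, which is the usual case in real elections.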

Another technique used abroad is to have majority voting with a single runoff election. Suppose there are 4 parties (A,B,C,D) with candidates for president; a first election is held and the two top vote getters (say, A and B) face off in a runoff a week or two later. During this time, C and D can enter into a coalition with A or B and their voters will now vote for the coalition partner. So those minority votes in the first round live to fight another day. Also that president, once elected, now represents a coalition which mitigates the kind of extreme Manichaeism of today’s U.S. system. It can lead to some strange bedfellows, though. In the 2002 presidential election in France, to stop the far-right National Front candidate Jean-Marie LePen from winning the second round, the left wing parties had to back their long time right wing nemesis Jacques Chirac.
However one might criticize these countries and their election systems, the fact is that voter participation is far higher than that in the U.S.

For U.S. presidential elections, what with the Electoral College and all that, “it’s complicated.” For Hamilton, Madison and others, the Electoral College would serve as an additional buffer between the masses and the government: one way this was to be achieved was by means of the “faithless elector,” one who does not vote for the candidate he pledged to – this stratagem would overturn a mass vote for a potential despot. This was considered a feature and not a bug; this feature is still in force and some pledged electors do employ it – in the 2016 election, seven electors voted against their pledged candidates, two against Trump and five against Clinton. But, except for faithless electors, how else could the Electoral College stymie the will of the people? Mystère.

That the Electoral College can indeed serve as a buffer between the presidency and the population has been proven by four elections (1876, 1888, 2000, 2016) where the Democratic candidate carried the popular vote but the Republican candidate obtained a majority in the Electoral College; most scandalously, in the 1876 election, in a backroom deal, 20 disputed electoral votes were awarded to the Republican candidate Rutherford B. Hayes to give him a majority of 1 vote in exchange for the end of Reconstruction in the South – “probably the low point in our republic’s history” to cite Gore Vidal.

That the Electoral College can indeed serve as a buffer between the presidency and the population has also been proven by the elections of 1800 and 1824 where no candidate had a majority of the electoral vote; in this case, the Constitution specifies that the election is to be decided in the House of Representatives with each state having one vote. In 1824, the populist candidate, Andrew Jackson, won a plurality both of the popular vote and the electoral vote, but on the first ballot a majority of the state delegations, cajoled by Henry Clay, voted for the establishment candidate John Quincy Adams. In the election of 1800, Jefferson and Burr were the top electoral vote getters with 73 votes each. Jefferson won a majority in the House on the 36th ballot, his victory engineered by Hamilton who disliked Jefferson but loathed Burr – we know how this story will end, unfortunately.

For conspiracy theorists, it is worth pointing out that not only were all four candidates who won the popular vote but lost the electoral vote Democrats but that three of the four were from New York State as was Aaron Burr.

The most obvious shortcoming of the Electoral College system is that it is a form of gerrymandering that gives too much power and representation to rural states at the expense of large urban states; in English terms, it creates “rotten boroughs.” For example, using 2018 figures, California has 55 electoral votes for 39,776,830 people and Wyoming has 3 votes for 573,720; so, if one does the math, 1 vote for president in Wyoming is worth 3.78 votes in California. Backing up, let us “show our work.” When we solved this kind of problem in elementary school, we used the rule, the “product of the means is equal to the product of the extremes”; thus, using the camel case dear to programmers, we start with the proportion

     votesWyHas : wyPopulation :: votesCaShouldHave : caPopulation

where : is read “is to” and “::” is read “as.” Three of the four terms have known values and so the proportion becomes

     3 : 573,720 :: votesCaShouldHave : 39,776,830

The above rule says that the product of the inner two terms is equal to that of the outer two terms. The term votesCaShouldHave is the unknown, so let us call it x; applying the rule, with * as the symbol for multiplication, we need to solve the following equation:

     3 * 39,776,830 = 573,720 * x

which yields the number of electors California would have, were it to have as many electors per person as Wyoming does; this simplifies to

     x = (3 * 39,776,830)/ 573,720 = 207.99

So California would have to have 207.99 electors to make things fair; dividing this figure by 55, we find that 1 vote in Wyoming is worth 207.99/55 = 3.78 votes in California. This is a most undemocratic formula for electing the President. But things can get worse. If the race is thrown into the House, since California has 69.33 times as many people as Wyoming, the ratio jumps to 1.0 to 69.33, making this the most undemocratic way of selecting a President imaginable. For state by state population figures, click  HERE .
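For readers who prefer a calculator to school proportions, the same computation can be done in a couple of lines of Python, using the 2018 figures quoted above.

```python
# 2018 figures quoted in the text.
wy_votes, wy_pop = 3, 573_720
ca_votes, ca_pop = 55, 39_776_830

# Product of the means equals the product of the extremes:
# 3 : 573,720 :: x : 39,776,830  =>  3 * 39,776,830 = 573,720 * x
x = wy_votes * ca_pop / wy_pop        # electors CA "should" have at WY's rate
print(round(x, 2))                    # → 207.99
print(round(x / ca_votes, 2))         # → 3.78
```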

One simple way to mitigate the undemocratic nature of the Electoral College would be to eliminate the 2 electoral votes that correspond to each state’s 2 senators. This would change the Wyoming/California ratio and make 1 vote in the Equality State worth only about 1.31 votes in the Golden State. With this counting technique, Trump would still have won the 2016 presidential election 236 to 195 (much less of a “massive landslide” than 304 to 227, the official tally) but Al Gore would have won the 2000 race, 228 to 209, even without Florida (as opposed to losing 266 to 271).

To tally the Electoral College vote, most states assign all their votes (via the “faithful” pledged electors) to the plurality winner of the state’s presidential tally. Nebraska and Maine, the two exceptions, use the congressional district method, which assigns the two votes that correspond to Senate seats to the overall plurality winner and one electoral vote to the plurality winner in each congressional district in the state. By way of example, in an election with 3 candidates, suppose a state has 3 representatives (so 5 electoral votes) and that one candidate obtains 50% of the total vote and the other two 25% each; then if each candidate is the plurality winner in the vote from exactly one congressional district, the top vote-getter is assigned the 2 votes for the state’s senators plus 1 vote for the congressional district he or she has won and the other two candidates receive 1 electoral vote each. This system yields a more representative result, but note that gerrymandering will still impact who the winner is in each congressional district. What is intrinsically dangerous about this practice, though, is that, if candidates for more than two parties are running, it can dramatically increase the chances that the presidential election will be thrown into the House of Representatives. In this situation, the Twelfth Amendment ordains that the 3 top electoral vote-getters must be considered for the presidency and so, if this method had been employed generally in the past, the elections of 1860 (Lincoln), 1912 (Wilson) and 1992 (Clinton) could well have given us presidents Stephen Douglas, Teddy Roosevelt and Ross Perot.
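The 5-electoral-vote example can be sketched in Python; the district-level vote counts below are invented for the illustration, arranged so that each candidate carries exactly one district.

```python
def congressional_district_method(statewide, districts):
    """Maine/Nebraska-style allocation: 2 electoral votes to the statewide
    plurality winner, 1 to the plurality winner of each district."""
    tally = {}
    statewide_winner = max(statewide, key=statewide.get)
    tally[statewide_winner] = 2
    for district in districts:
        winner = max(district, key=district.get)
        tally[winner] = tally.get(winner, 0) + 1
    return tally

# The text's example: 50/25/25 statewide, each candidate winning one district.
statewide = {"A": 50, "B": 25, "C": 25}
districts = [{"A": 60, "B": 20, "C": 20},
             {"A": 20, "B": 60, "C": 20},
             {"A": 20, "B": 20, "C": 60}]
print(congressional_district_method(statewide, districts))  # → {'A': 3, 'B': 1, 'C': 1}
```

A winner-take-all state would instead hand all 5 votes to A; the 3–1–1 split is what makes the method a form of proportional voting.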

The congressional district method is a form of proportional voting. While it could be disastrous in the framework of the Electoral College system for electing a U.S. president, proportional voting itself is successfully implemented in many countries to achieve more equitable outcomes than that furnished by plurality voting and the two party system.

A voting system used in many American cities such as Minneapolis and Oakland and in countries such as Australia and Ireland is known as ranked choice voting or instant-runoff voting. Voters in Maine recently voted for this system to be used in races for seats in the U.S. House of Representatives and in party primaries. Ranked choice voting emulates runoff elections but in a single round of balloting; it is a much more even-handed way to choose a winner than plurality voting. Suppose there are 3 candidates – A, B and C; then, on the ballot, each voter lists the 3 candidates in the order of that voter’s preference. For the first round, a count is made of the number of first place votes each candidate received; if for one candidate that number is a majority, that candidate wins outright. Otherwise, the candidate with the fewest first place votes, say A, is eliminated; now we go into the “second round” with only B and C as candidates and we add to B’s first place total the number of ballots for A that listed B as second choice, and similarly for C. Now, except in the case of a tie, either B or C will have a clear majority and will be declared the winner. This gives the same result that staging a runoff between B and C would yield. With 3 candidates, at most 2 rounds are required; if there were 4 candidates, up to 3 rounds could be needed, etc.
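The elimination procedure just described can be sketched in Python. This is a simplified illustration: it assumes every ballot ranks all the candidates and it ignores the tie-breaking rules that real election law must spell out.

```python
from collections import Counter

def instant_runoff(ballots):
    """Instant-runoff: eliminate the candidate with the fewest first-place
    votes and transfer those ballots until someone has a majority."""
    candidates = set(c for ballot in ballots for c in ballot)
    while True:
        # Each ballot counts for its highest-ranked surviving candidate.
        counts = Counter(
            next(c for c in ballot if c in candidates) for ballot in ballots)
        leader, votes = counts.most_common(1)[0]
        if votes * 2 > len(ballots):          # outright majority
            return leader
        candidates.remove(min(counts, key=counts.get))

# 100 illustrative ballots, each a ranking from most to least preferred.
ballots = ([("A", "B", "C")] * 40 +
           [("B", "C", "A")] * 35 +
           [("C", "B", "A")] * 25)
print(instant_runoff(ballots))  # → B
```

In this scenario A leads the first round 40–35–25 but has no majority; C is eliminated, C's ballots transfer to B, and B wins the second round 60–40, just as a staged runoff between A and B's plurality-round survivors would have it.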

Most interestingly, in the Maine 2018 election, in one congressional district, no candidate for the House of Representatives gathered an absolute majority on the first round but a candidate who received fewer first place votes on that first round won on the second round when he caught up and surged ahead because of the number of voters who made him their second choice. (For details, click  HERE ).

Naturally, all this is being challenged in court by the losing side. However, for elections, Section 4 of Article 1 of the U.S. Constitution leaves implementation to the states for them to carry out in the manner they deem fit – subject to Congressional oversight but not to judiciary oversight:

   “The Times, Places and Manner of holding Elections for Senators and Representatives, shall be prescribed in each State by the Legislature thereof; but the Congress may at any time by Law make or alter such Regulations, except as to the Places of chusing (sic) Senators.”

N.B. In this article of the Constitution, the senators are an exception because at that time senators were chosen by the state legislatures; direct election of senators by popular vote had to wait until 1913 and the 17th Amendment.

At the first legal challenge to it, the new Maine system was upheld vigorously in the United States District Court based in large part on Section 4 of Article 1 above. For the ruling itself, click  HERE . But the story will not likely end so simply.

This kind of voting system is also used by the Academy of Motion Picture Arts and Sciences to select the nominees in each category, but they call it preferential voting. So to determine five directors, they apply the elimination process until only five candidates remain.

With ranked choice voting, in Florida, in that 2000 election, if the Nader voters listed Ralph Nader first, Al Gore (who was strong on the environment) second and George Bush third and if all Pat Buchanan voters listed Buchanan first, Bush second and Gore third, Gore would have carried the day by over 79,000 votes in the third and final round.
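The arithmetic behind that claim can be checked directly. The figures below are the widely reported certified Florida totals (treat them as approximate), and the transfers follow the scenario described above, with the two minor-party candidacies exhausted by the final round.

```python
# Certified Florida 2000 totals (as widely reported; treat as approximate).
bush, gore, nader, buchanan = 2_912_790, 2_912_253, 97_488, 17_484

# Final head-to-head round under the scenario in the text:
gore_final = gore + nader          # Nader ballots transfer to Gore
bush_final = bush + buchanan       # Buchanan ballots transfer to Bush
print(gore_final - bush_final)     # → 79467
```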

However one might criticize countries like Australia and Ireland and their election systems, the fact is that voter participation is far higher than that in the U.S. For numbers, click  HERE .

Ranked voting systems are not new and have long been a topic of interest to social scientists and mathematicians. The French Enlightenment thinker, the Marquis de Condorcet, introduced the notion of the Condorcet Winner of an election – the candidate who would beat all the other candidates in a head-to-head election based on the ballot rankings; he is also the author of Condorcet’s Paradox – that a ranked choice setup might not produce a Condorcet Winner. To analyze this situation, the English mathematician Charles Lutwidge Dodgson introduced the Dodgson Method, an algorithm for measuring how far the result of a given election using ranked choice voting is from producing a Condorcet Winner. More recently, the mathematician and economist Kenneth Arrow authored Arrow’s Paradox, which shows that ranked voting can sometimes be gamed by using the idea behind Condorcet’s Paradox: in certain situations, voters can assure the victory of their most preferred candidate by listing that candidate 2nd and not 1st – the trick is to knock out an opponent one’s favorite would lose to head-to-head by favoring a weaker opponent, one who will eliminate the feared candidate and then be defeated in the final head-to-head election. For his efforts, Arrow was awarded a Nobel Prize; for his efforts, Condorcet had a street named for him in Paris (click  HERE ); for his efforts, Charles Lutwidge Dodgson had to latinize his first and middle names, then reverse them to form the pen name Lewis Carroll, and then proceed to write Alice in Wonderland and Jabberwocky, all to rescue himself from the obscurity that usually awaits mathematicians. For a detailed but playful presentation on paradoxes and ranked choice voting, click  HERE .
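Both the Condorcet Winner and Condorcet's Paradox can be illustrated with a short Python sketch; the ballot counts below are invented, the first set arranged to force a cycle.

```python
def condorcet_winner(ballots):
    """Return the candidate who beats every other candidate head-to-head,
    or None if the rankings cycle (Condorcet's Paradox)."""
    candidates = set(c for ballot in ballots for c in ballot)
    def beats(a, b):
        # a beats b if a majority of ballots rank a above b.
        ahead = sum(ballot.index(a) < ballot.index(b) for ballot in ballots)
        return ahead * 2 > len(ballots)
    for c in candidates:
        if all(beats(c, other) for other in candidates - {c}):
            return c
    return None

# A cycling electorate: A beats B, B beats C, C beats A head-to-head.
cycle = ([("A", "B", "C")] * 34 +
         [("B", "C", "A")] * 33 +
         [("C", "A", "B")] * 33)
print(condorcet_winner(cycle))   # → None

# With full rankings, a majority first choice is always the Condorcet Winner.
clear = [("B", "A", "C")] * 60 + [("A", "C", "B")] * 40
print(condorcet_winner(clear))   # → B
```

The first electorate is the paradox in miniature: every candidate loses to somebody, so no Condorcet Winner exists even though every individual ballot is a perfectly consistent ranking.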

The Constitution – U.S. Scripture III

In 1787, the Confederation Congress called for a Constitutional Convention with the goal of replacing the Articles of Confederation with a form of government that had the central power necessary to lead the states and the territories.
This had to be a document very different from the Iroquois Great Law of Peace, from the Union of Utrecht and from the Articles of Confederation themselves. It had to provide for a centralized structure that would exercise legislative and executive power on behalf of all the states and territories. Were there existing historical precedents for a written document to set up a social contract of government? Mystère.
In antiquity, there was “The Athenian Constitution”; but this text, credited to Aristotle and his students at the Lyceum, is not a founding document; rather it is an after-the-fact compilation of the workings of the Athenian political system.
In the Middle Ages there was the Magna Carta of 1215 with its legal protections such as trial by jury and its limitations on the power of the king. Though the Magna Carta itself was quickly annulled by a bull of the crusade-loving Pope Innocent III as “illegal, unjust, harmful to royal rights and shameful to the English people,” it served as a template for insulating the citizen from the power of the state.
There is a book entitled “The English Constitution” but this was published by Walter Bagehot in the latter part of the 19th century and, like “The Athenian Constitution,” it is an account of existing practices and procedures rather than any kind of founding document. This is the book that the 13 year old Elizabeth is studying when taking history lessons with the provost of Eton in the TV series “The Crown.”
For an actual example of a nation’s constitution that pre-dates 1789, one has to go back to 1600, the year that the Constitution of San Marino was adopted. However, there is no evidence that the founding fathers knew anything of this at all. Since that time, this document has been the law of this land-locked micro-state and it has weathered many storms; most recently, during the Cold War, it gave San Marino its 15 minutes of fame when the citizens elected a government of Communist Party members and then peacefully voted them out of office twelve years later. For an image of St. Martinus, stonemason and founding father of this, the world’s smallest republic, click  HERE .
The English Bill of Rights of 1689, an Act of Parliament, is a constitutional document in that it transformed an absolute monarchy into a constitutional monarchy. This is the key role of a constitution – it tempers or replaces traditional monarchy based on the Divine Right of Kings with an explicit social contract. This sharing of power between the monarch and the parliament made England the first Constitutional Monarchy – in simple terms, the division of roles made the parliament the legislature and made the king the executive. To get a sense of how radical this development was, it took place only a few years after Louis XIV of France reportedly exclaimed “L’Etat, c’est moi” (“I am the State”).
With independence brewing, the Continental Congress in May 1776 directed the colonies to draw up constitutions for their own governance. The immediate precursors to the U.S. Constitution then were the state constitutions of 1776, 1777 and 1780, born of the break with Great Britain.
An important influence on this generation of constitutional documents was the work of the French Enlightenment philosopher Montesquieu. In his Spirit of the Laws (1748) Montesquieu analyzed forms of government and how different forms matched with different kinds of nations – small nations best being republics, medium sized nations best served by a constitutional monarchy and very large ones best being empires. His analysis broke government down into executive, legislative and (to a lesser extent) judicial powers and he argued that, to avoid tyranny, these should be separate and independent of each other so that the power of any one of these would not exceed the combined power of the other two.
The state of Connecticut did not adopt a new constitution but continued with its (sometimes updated) Royal Charter of 1662. In the matter of religious freedom, in Connecticut, the Congregational Church was effectively the established state religion until the Constitution of 1818. Elsewhere, the antidisestablishmentarians generally lost out. For example, in the case of New York, Georgia, Rhode Island, Pennsylvania and Massachusetts, the state constitution guaranteed freedom of religion; in New Hampshire, the Constitution of 1776 was silent on the subject, but “freedom of conscience”  was guaranteed in the expanded version of 1784.  In Delaware’s case, it prohibited the installation of an established religion; in Virginia’s case, it took the Virginia Statute for Religious Freedom, written by Thomas Jefferson in 1786, to stave off the threat of an established church. On the other hand, the Maryland Constitution only guaranteed freedom of religion to “persons professing the Christian religion” (which was the same as in Maryland’s famous Toleration Act of 1649, the first step in the right direction in the colonies). In the 1776 document, Anglicanism is the established religion in South Carolina – this was undone in the 1778 revision. In the North Carolina Constitution, Article 32 affirms “That no person who shall deny the being of God, or the truth of the Protestant religion … shall be capable of holding any office … within this State”; New Jersey’s Constitution had a similar clause. It seems that from a legal point of view, a state still has the authority to have its own established or favored religion and attempts to move in this direction are still being made in North Carolina and elsewhere – the First Amendment explicitly only prohibits Congress from setting up an established religion for the country as a whole.
The challenges confronting the Constitutional Convention in Philadelphia in 1787 were many – to craft a system with a sufficiently strong central authority but not one that could morph into a dictatorship or mob rule, to preserve federalism and states’ rights (in particular, for the defense of the peculiar institution of slavery), to preserve popular sovereignty through a system of elections, etc. Who, then, rose to the occasion and provided the intellectual and political drive to get this done? Mystère.
Thomas Jefferson was the ambassador to France, John Adams was the ambassador to Great Britain and neither attended the Convention. Benjamin Franklin was one of the few who took a stand for the abolition of slavery but to no avail; Alexander Hamilton had but a bit part and his main initiative was roundly defeated. George Washington took part only at James Madison’s urging (but did serve as president of the Convention). But it is Madison who is known as the Father of the Constitution.
Madison and the others were keen readers of the Roman historian Publius Cornelius Tacitus who pitilessly described the transformation of the Roman Senatorial class from lawmakers into sniveling courtiers with the transformation of the Roman Republic into the Roman Empire; Montesquieu also wrote about the end of the Roman Republic. On the other hand, the rumblings leading up to the French Revolution could be heard and the threat of mob rule was not unrealistic. So the fear of creating a tyrannical regime was very much with them.
Madison’s plan for a strong government that would not turn autocratic was, like some of the state constitutions, based on the application of ideas of Montesquieu. In fact, in Federalist No. 47, Madison (using the Federalist pseudonym Publius) developed Montesquieu’s analysis of the separation of powers further and enunciated the principle of “checks and balances.”
For his part, Hamilton pushed for a very strong central government modeled on the English system with his British Plan; however, this plan was not adopted, nor were the plans for structuring the government proposed by Virginia and by New Jersey. Instead, a balance between large and small states was achieved by means of the Connecticut Compromise: there would be a bicameral legislature with an upper house, the Senate, having two senators from each state; there would be a lower house, the House of Representatives, with each state having a number of representatives proportional to its population. While the senators would be appointed by the state legislatures, the representatives would be chosen by popular vote (restricted to white men of property, of course).
This bicameral setup, with its upper house, was designed to reduce the threat of mob rule. However, it also brought up the problem of computing each state’s population for the purpose of determining representation in the House of Representatives. The resulting Three-Fifths Compromise stipulated that 3/5ths of the slave population in a state would count toward the state’s total population for this computation. This compromise created the need for an electoral college to elect the president, since enslaved African Americans would not each have three-fifths of a vote! So the system of electors was introduced and each state would have one elector for each member of Congress.
Far from abolishing slavery, Article 1, Section 9, Clause 1 of the Constitution prohibited Congress from making any law that would interfere with the international slave trade until 1808 at the earliest. However, Jefferson had stood for ending this traffic since the 1770’s and, in his second term in 1807, the Act Prohibiting the Importation of Slaves was passed and the ban was initiated the following year.  However, the ban was often violated and some importation of slaves continued into the Civil War. In 1807, the British set up a similar ban on the slave trade in the Empire and the British Navy actively enforced it against ships of all nations off the coast of West Africa and elsewhere, technically classifying slave traders as pirates; this was an important impediment to the importation of slaves into the United States.
For Hamilton, Madison and others, the Electoral College would serve as an additional buffer between the masses and the government: one way this was to be achieved was by means of the “faithless elector,” one who does not vote for the candidate he pledged to – this stratagem would overturn a mass vote for a potential despot. This was considered a feature and not a bug; this feature is still in force and some pledged electors do employ it – in the 2016 election, seven electors voted against their pledged candidates, two against Trump and five against Clinton.
The Constitution left it to the states to determine who is eligible to vote. With some exceptions here and there at different times, the result was that only white males who owned property were eligible to vote. This belief in the “divine right of the propertied” has its roots in the work of John Locke; it also can be traced back to a utopian composition published in 1656 by James Harrington; in The Commonwealth of Oceana he describes an egalitarian society with an ideal constitution where there is a limit on how much property a family can own and rules for distributing property; there is a senate and there are elections and term limits. Harrington promulgated the idea of a written constitution arguing that a well-designed, rational document would curtail dangerous conflicts of interest. This kind of interest in political systems was dangerous back then; Oliver Cromwell blocked publication of Harrington’s work until it was dedicated to the Lord Protector himself; with the Stuart Restoration, Harrington was jailed in the Tower of London and died soon after as a result of mistreatment. For a portrait, click HERE .
In any case, it wasn’t until 1856 that even universal suffrage for white males became established in the U.S. For the enfranchisement of the rest of the population, it took the Civil War and constant militancy up to and during WWI. A uniform election day was not fixed until 1845 and there are no real federal guidelines for election standards. This issue is still very much with us, as demonstrated by a wave of voter suppression laws in the states newly released from the strictures of the Voting Rights Act by the Roberts Court with the 2013 decision in Shelby County v. Holder.
Finally, a four page document entitled Constitution of the United States of America was submitted to the states in September 1787 for ratification. This process required nine of the thirteen states; the first to ratify it was Delaware and the ninth was New Hampshire. There was no Bill of Rights and no provision for judicial review of legislation. Political parties were not expected to play a significant role and the provisions for the election of president and vice-president were so clumsy that they exacerbated the electoral crisis of 1800 which ultimately led to the duel between Aaron Burr and Alexander Hamilton.
The Confederation Congress declared the Constitution ratified in September 1788 and the first presidential election was held. Congress was seated and George Washington became President in the spring of 1789.
In American life, the Constitution has truly become unquestionable, sacred scripture and the word unconstitutional has the force of a curse. As a result, to a large extent, Americans are frozen in place and are not able to be forward looking in dealing with the myriad new kinds of problems, issues and opportunities that contemporary life creates.
For example, the Constitution provides for an Amendment process that requires ratification by 3/4ths of the states. When there were 13 states huddled together on the Eastern Seaboard, this worked fine and the first 10 amendments, the Bill of Rights, were passed quickly after the Constitution was adopted. However, today this process is most cumbersome. For example, any change in the Electoral College system would require an amendment to the Constitution; but any 13 states could block an attempt at change and the 13 smallest states, which have barely 4% of the population, would not find it in their interest to make any such change, alas. Another victim is term limits for members of Congress. It is in states’ interest to have senators and representatives with seniority so they can accede to powerful committee chairmanships, etc.; this is the old Dixiecrat strategy that kept Strom Thurmond in the Senate until he was over 100 years old – but then the root of the word senator is the Latin senex, which means “old man.” The Constitution does provide for a second way for it to be amended: 34 state legislatures would have to pass applications for a constitutional convention to deal with, say, term limits; this method has never been used successfully, but a group, “U.S. Term Limits,” is trying just that.
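The "any 13 states" figure is simple arithmetic: with 50 states, ratification takes 3/4 of them rounded up, and the smallest coalition that can block an amendment is whatever is left over, plus one.

```python
from math import ceil

states = 50
needed = ceil(states * 3 / 4)      # states required to ratify an amendment
blockers = states - needed + 1     # smallest coalition that can block one
print(needed, blockers)            # → 38 13
```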
The idea of judicial review of laws passed by Congress did come up at the Convention. Madison first wanted there to be a set of judges to assist the president in deciding whether or not to veto a bill. In the end, nothing was set down clearly in the Constitution and the practice of having courts review constitutionality came about by a kind of judicial fiat when John Marshall’s Supreme Court ruled a section of an act of Congress to be unconstitutional. Today, any law passed by Congress and signed by the President has to go through an interminable process of review in the courts and in the end, the law means only what the courts say it means. Contrast this with the U.K., where the meaning of a law is what Parliament says it is. As a result, with the Supreme Court politicized the way it is, law is actually made today by the Court and the Congress just licks its wounds. The most critical decisions are thus made by 5 unelected career lawyers. Already in 1921, seeing what was happening in the U.S., the French jurist Edouard Lambert coined the phrase “gouvernement des juges” (“government by judges”) for the way judges privileged their personal slant on cases before them to the detriment of a straightforward interpretation of the letter and spirit of the law.
The reduction of the role of Congress and the interference of the courts have also contributed to the emergence of an imperial presidency. The Constitution gives only Congress the right to levy tariffs or declare war; but now the president imposes tariffs, sends troops off to war, and governs mostly by executive order. Much of this is “justified” by a need for expediency and quick decision making in running such a complex country – but this is, as Montesquieu and others pointed out, the sort of thing that led to the end of the Roman Republic.

The Articles – U.S. Scripture II

The second text of U.S. scripture, the Articles of Confederation, gets much less attention than the Declaration of Independence or the Constitution. Still, it set up the political structure by which the new country was run for its first thirteen years; it provided the military structure to win the war for independence; and it furnished the diplomatic structure to secure French support during that war and then to reach an advantageous peace treaty with the British Empire.
The Iroquois Great Law of Peace and the Albany Plan of Benjamin Franklin that it inspired are certainly precursors of the Articles of Confederation. In fact, the First Continental Congress set up a meeting in Albany, NY in August 1775 with the Iroquois. The colonists informed the Iroquois of the possibility of the colonies’ breaking with Britain, acknowledged the debt they owed them for their example and advice, and presumably tried to test whether the Iroquois would support the British in the case of an American move for independence. In the end, the Iroquois did stay loyal to the British (the Royal Proclamation of 1763 would have had something to do with that). The French Canadians also stayed loyal to the British; in this case too, it was likely a case of preferring “the devil you know.”
Were there other forerunners to the Articles? Mystère.
Another precursor of the Articles was the Dutch Republic’s Union of Utrecht of 1579, which created a confederation of seven provinces in the north of the Netherlands. Like that of the Iroquois, the Dutch system left each component virtually independent except for issues like the common defense. This was not a democratic system in the modern sense in that each province was controlled by a ruling clique called the regents. For affairs of common interest, the Republic had a governing body called the Staaten (parliament) with one representative from each province. Henry Hudson’s voyage of 1609 was made in the service of the Dutch East India Company, which had been chartered by the Staaten, and he loyally named the westernmost outpost of New York City Staaten Eylandt. (This is not inconsistent with the old Vaudeville joke that the name originated with the near-sighted Henry asking “Is dat an Eyelandt?!!!” in his broken Dutch.) Hudson stopped short of naming the mighty river he navigated for himself; the Native Americans called it the Mahicanituck and the Dutch simply called it the North River.
Like the American colonists but two hundred years earlier, to achieve independence the citizens of the Dutch Republic had to rebel against the mightiest empire of the time, in this case that of Philip II of Spain. However, the Dutch Republic in its Golden Age was the most prosperous country in Europe and among the most powerful, proving its military mettle in the Anglo-Dutch Wars of the 17th century – all of which gave rise to the unflattering English-language expressions Dutch Courage (bravery fueled by alcohol), Dutch Widow (a woman of ill repute), Dutch Uncle (someone not at all avuncular), Dutch Comfort (a comment like “things could be worse”) and, of course, Dutch Treat. The Dutch Republic was also remarkable for protecting civil liberties and religious freedom, keys to the domestic tranquility that did find their way into the U.S. Bill of Rights. For a painting by the Dutch Master Abraham Storck of a scene from the Four Days Battle during the Second Anglo-Dutch War, click HERE.
The Articles of Confederation were approved by the Second Continental Congress on November 15, 1777. Though technically not ratified by the states until 1781, the Articles steered the new country through the Revolutionary War and continued to be in force until 1789. The Articles embraced the federalism of the Iroquois Confederation and the Dutch Republic; they rejected the principle of the Divine Right of Kings in favor of republicanism and they embraced the idea of popular sovereignty, affirming that power resides with the people.
The Congress of the Confederation had a unicameral legislature (like the Staaten and Nebraska). It had a presiding officer referred to as the President of the United States who organized the deliberations of the Congress, but who did not have executive authority. In all, there were ten presidents, John Hancock and Richard Henry Lee among them. John Hanson, a wealthy landowner and slaveholder from Maryland, was the first president, and so wags claim that “John Hanson” and not “George Washington” is the correct answer to the trivia question “who was the first U.S. president” – by the way, the answer to the question “which president first called for a Day of Thanksgiving on a Thursday in November” is also “John Hanson.” For a statue of the man, click HERE.
The arrangement was truly federal: each state had one vote and ordinary matters required a simple majority of the states. The Congress could not levy taxes itself but depended on the states for its revenue. On the other hand, Congress could coin money and conduct foreign policy but decisions on making war, entering into treaties, regulating coinage, and some other important issues required the vote of nine states in the Congress.
Not surprisingly, given the colonists’ opposition to the Royal Proclamation of 1763, during the Revolutionary War the Americans took action to wrest the coveted land west of the Appalachians away from the British. George Rogers Clark, a general in the Virginia Militia (and older brother of William of “Lewis and Clark” fame), is celebrated for the Illinois Campaign and the captures of Kaskaskia (Illinois) and Vincennes (Indiana). For the Porte de Vincennes metro stop in Paris, click HERE.
As the French, Dutch and Spanish squabbled with the English over the terms of a treaty to end the American War of Independence and dithered over issues of interest to these imperial powers ranging from Gibraltar to the Caribbean to the Spice Islands to Senegal, the Americans and the English put together their own deal (infuriating the others, especially the French). This arrangement ceded the land east of the Mississippi and south of the Great Lakes (except for Florida) to the newly born United States. The Florida territory was transferred back once again to Spain. The French had wanted all that land east of the Mississippi and west of the Appalachians to be ceded to their ally Spain, which also controlled the Louisiana Territory at this time. Given how Spain returned the Louisiana Territory to France by means of a secret treaty twenty years later, the bold American diplomatic dealings in the Treaty of Paris proved to be prescient. The Americans who signed the treaty with England were Benjamin Franklin, John Adams and John Jay.
The treaty with England was signed in 1783 and ratified by the Confederation Congress, then sitting at the Maryland State House in Annapolis, on January 14, 1784.
However, hostilities between American militias and British and Native American forces continued after Cornwallis’ defeat at Yorktown and even after the signing of the treaty that officially ended the war; in fact, the British did not relinquish Fort Detroit and surrounding settlements until Jay’s Treaty which took effect in 1796. Many thought this treaty made too many concessions to the British on commercial and maritime matters and, for his efforts, Jay was hanged and burned in effigy everywhere by anti-Federalists. Jay reportedly joked that he could find his way across the country by the light of his burning effigies. Click HERE for a political cartoon from the period.

A noted achievement of the Confederation Congress was the Ordinance of 1787 (aka the Northwest Ordinance), approved on July 13, when the Congress was seated at Federal Hall in New York City. The Northwest Territory comprised the lands that became the five states of Illinois, Michigan, Wisconsin, Indiana and Ohio – the elementary school mnemonic was “I met Walter in Ohio.” Four of these names are Native American in origin; Indiana is named for the Indiana Land Company, a group of real estate investors. The Ordinance outlawed slavery in these areas (but it did include a fugitive slave clause), provided a protocol for territories’ becoming states, acknowledged the land rights of Native Americans, established freedom of navigation on lakes and rivers, established the principle of public education (including universities), … . In fact, no time was wasted: the Ohio Company’s land purchase of 1787 set aside land for a university, and Ohio University, chartered in 1804, is located in Athens (naturally) and today has over 30,000 students. The Ordinance was re-affirmed when the Constitution replaced the Articles of Confederation.

With all these successes in war and in making peace, what drove the Americans to abandon the proven formula of a confederation of tribes or provinces and seek to replace it? Again, mystère.

While the Articles of Confederation were successful when it came to waging war and providing for new states, it was economic policy and domestic strife that made the case for a stronger central government.

Under the Articles of Confederation, the power to tax stayed with the individual states; in 1781 and again in 1786, serious efforts were made to amend the Articles so that the Confederation Congress itself could levy taxes; both efforts failed leaving the Congress without control over its own finances. During and after the war, both the Congress and individual states printed money, money that soon was “not worth a continental.”

In 1786 and 1787, a rebellion broke out in Western Massachusetts, in the area around Springfield; the leader who emerged was Daniel Shays, a farm worker who had fought in the Revolution (Lexington and Concord, Saratoga, …) and who had been wounded – but Shays had never been paid for his service in the Continental Army and he was now being pursued by the courts for debts. He was not alone, and petitions by yeoman citizens for relief from debts and taxes were not being addressed by the State Legislature. The rebels shut down court houses and tried to seize the Federal Armory in Springfield; in this, they were thwarted only by an ad hoc militia raised with money from merchants in the east of the state. Afterwards, many rebels, including Shays, escaped to neighboring states such as New Hampshire and New York, out of reach of the Massachusetts militia.

Shays’ Rebellion shook the foundations of the new country and accelerated the process that led to the Constitutional Convention of 1787. It dramatically highlighted the shortcomings of such a decentralized system in matters of law and order and in matters economic; in contrast, with the new Constitution, Washington as President was able to lead troops to put down the Whiskey Rebellion and Hamilton as Secretary of the Treasury was able to re-organize the economy (national bank, assumption of states’ debts, protective tariffs, …). Click HERE for a picture of George back in the saddle; as for “Hamilton” tickets, just wait for the movie.

The Articles continued to be the law of the land into 1789: the third text of U.S. scripture, the U.S. Constitution, was ratified by the ninth state, New Hampshire, on June 21, 1788, and the Confederation Congress established March 4, 1789 as the date for the country to begin operating under the new Constitution.
How did that work out? More to come. Affaire à suivre.

The Declaration – U.S. Scripture I

There are three founding texts for Americans, texts treated like sacred scripture. The first is the Declaration of Independence, a stirring document both political and philosophical; in schools and elsewhere, it is read and recited with religious spirit. The second is the Articles of Confederation; a government based on this text was established by the Second Continental Congress; despite the new country’s success in waging the Revolutionary War and in reaching an advantageous peace treaty with the British Empire, this document is not venerated by Americans in quite the same way because this form of government was superseded after thirteen years by that of the third founding text, the Constitution. These three texts play the role of secular scripture in the United States; in particular, the Constitution, although only 4 pages long without amendments, is truly revered and quoted like “chapter and verse.”

Athena, the Greek goddess, is said to have sprung full grown from the head of Zeus (and in full armor, to boot); did these founding texts just emerge from the heads and pens of the founding fathers? In particular, there is the Declaration of Independence. Did it have a precursor? Was it part of the spirit of the times, of the Zeitgeist? Mystère.

Though still under British rule in 1775 when hostilities broke out at Lexington and Concord, the colonies had had 159 years of self-government at the local level: they elected law makers, named judges, ran courts and collected taxes. Forward-looking government took root early in the colonies. Already in 1619, in Virginia, the House of Burgesses was set up, the first legislative assembly of elected representatives in North America; in 1620, the Pilgrims drew up the Mayflower Compact before even landing; in 1683, the colonial assembly in New York passed the Charter of Liberties. A peculiar matrix evolved where there was slavery and indentured servitude on the one hand and progress in civil liberties on the other (one example: the trial of John Peter Zenger in New York City and the establishment of Freedom of the Press).

In fact, the period from 1690 till 1763 is known as the “period of salutary neglect,” during which the British pretty much left the colonies to fend for themselves – the phrase “salutary neglect” was coined by the British parliamentarian Edmund Burke, “the father of modern conservatism.” Salutary neglect was abandoned at the end of the French and Indian War (aka the Seven Years War), which was a “glorious victory” for the British but which left them with large war debts; their idea was to have the Americans “pay their fair share.”

During the run-up to the French and Indian War, at the Albany Convention in 1754, Benjamin Franklin proposed the Albany Plan, which was an early attempt to unify the colonies “under one government as far as might be necessary for defense and other general important purposes.” The main thrust was mutual defense and, of course, it would all be done under the authority of the British crown.

The Albany Plan was influenced by the Iroquois’ Great Law of Peace, a compact that long predated the arrival of Europeans in the Americas; this compact is also known as the Iroquois Constitution. This constitution provided the political basis for the Haudenosaunee, aka the Iroquois Confederation, a confederacy of six major tribes. The system was federal in nature and left each tribe largely responsible for its own affairs. Theirs was a very egalitarian society and for matters of group interest such as the common defense, a council of chiefs (who were designated by senior women of their clans) had to reach a consensus. The Iroquois were the dominant Indian group in the northeast and stayed unified in their dealings with the French, British and Americans. In a letter to a colleague in 1751, Benjamin Franklin acknowledged his debt to the Iroquois with this amazing admixture of respect and condescension:

“It would be a strange thing if six nations of ignorant savages should be capable of forming such a union, and yet it has subsisted for ages and appears indissolvable, and yet a like union should be impractical for 10 or a dozen English colonies.”

The Iroquois Constitution was also the subject of a groundbreaking ethnographic monograph. In 1724, the French Jesuit missionary Joseph-François Lafitau published a treatise on Iroquois society, Mœurs des sauvages amériquains comparées aux mœurs des premiers temps (Customs of the American Indians Compared with the Customs of Primitive Times), in which he describes the workings of the Iroquois system and compares it to the political systems of the ancient world in an attempt to establish a commonality shared by all human societies. Lafitau admired this egalitarian society where each Iroquois, he observed, views “others as masters of their own actions and of themselves” and each Iroquois lets others “conduct themselves as they wish and judges only himself.”

The pioneering American anthropologist Lewis Henry Morgan, who studied Iroquois society in depth, was also impressed with the democratic nature of their way of life, writing “Their whole civil policy was averse to the concentration of power in the hands of any single individual.” In turn, Morgan had a very strong influence on Frederick Engels’ The Origin of the Family, Private Property, and the State: in the Light of the Researches of Lewis H. Morgan (1884). Apropos of the Iroquois Constitution, Engels (using “gentile” in its root Latin sense of “tribal”) waxed lyrical and exclaimed “This gentile constitution is wonderful.” Engels’ work, written after Karl Marx’s death, had for its starting point Marx’s notes on Morgan’s treatise Ancient Society (1877).

All of these European writers thought that Iroquois society was an intermediate stage in a progression subject to certain laws of sociology, a progression toward a society and way of life like their own. Of course, Marx and Engels did not think things would stop there.

At the end of the French and Indian War, the British prevented the colonists, who numbered around 2 million at this point, from pushing west over the Appalachians and Alleghenies with the Royal Proclamation of 1763; indeed, his Majesty George III proclaimed (using the royal “our”) that this interdiction was to apply “for the present, and until our further pleasure be known.” This proclamation was designed to pacify French settlers and traders in the area and to keep the peace with Native American tribes, in particular the Iroquois Confederation who, unlike nearly all other tribes, did side with the British in the French and Indian War. It particularly infuriated land investors such as Patrick Henry and George Washington – the latter, a surveyor by trade, founded the Mississippi Land Company in 1763 just before the Proclamation with the expectation of profits from investments in the Ohio River Valley, an expectation dashed by the Proclamation. Designs by Virginians on this region were not surprising given Virginia’s purported westward reach at that time (for a map, click HERE); even today, the Ohio River is the Western boundary of West Virginia. Washington recovered financially and at the time of his death was a very wealthy man.

Though the Royal Proclamation was flouted by colonists who continued to migrate west, it was the first in the series of proclamations and acts that finally outraged the colonists to the point of armed rebellion. Interestingly, in Canada the Royal Proclamation still forms the legal basis for the land rights of indigenous peoples. Doubtless, this has worked out better for both parties than the Discovery Doctrine, which has been in force in the U.S. since independence – for details, click HERE.

The Proclamation was soon followed by the hated Stamp Act, which was a tax directly levied by the British government on colonists, as opposed to a tax coming from the governing body of a colony. This led to the Stamp Act Congress which was held in New York City in October 1765. It was attended by representatives from 9 colonies and famously published its Declaration of Rights and Grievances which included the key point “There should be no taxation without representation.” This rallying cry of the Americans goes back to the English Bill of Rights of 1689 which asserts that taxes can only be enacted by elected representatives: “levying taxes without grant of Parliament is illegal.”

Rumblings of discontent continued as the British Parliament and King continued to alienate the colonists.

In the spring of 1774, Parliament abrogated the Massachusetts Charter of 1691, which gave people a considerable say in their government. In September, things boiled over not in Boston, but in the humble town of Worcester. Militiamen took over the courts and, in October at Town Meeting, independence from Britain was declared. The Committees of Correspondence assumed authority. (For a most useful guide to the pronunciation of “Worcester” and other Massachusetts place names, click HERE. By the way, the Worcester Art Museum is outstanding.)

From there, things moved quickly and not so quickly. While the push for independence was well advanced in Massachusetts, the delegates to the First Continental Congress in the fall of 1774 were not prepared to take that bold step: in a letter John Adams wrote “Absolute Independency … Startle[s] People here.”  Most delegates attending the Philadelphia gathering, he warned, were horrified by “The Proposal of Setting up a new Form of Government of our own.”

But acts of insurrection continued. For example, in December 1774 in New Hampshire, activists raided Fort William and Mary, seizing powder and weaponry. Things escalated leading to an outright battle at Lexington and Concord in April, 1775.

Two years after the First Congress, the delegates to the Second Continental Congress, meeting in Philadelphia, were ready by the spring of 1776 to move in the direction of independence. In parallel, on June 12, 1776 at Williamsburg, the Virginia Constitutional Convention adopted the Virginia Declaration of Rights, which called for a break from the Crown and which famously begins with

Section 1. That all men are by nature equally free and independent and have certain inherent rights, of which, when they enter into a state of society, they cannot, by any compact, deprive or divest their posterity; namely, the enjoyment of life and liberty, with the means of acquiring and possessing property, and pursuing and obtaining happiness and safety.

This document was authored principally by George Mason, a planter and friend of George Washington. Meanwhile, back in Philadelphia, Thomas Jefferson (who would have been familiar with the Virginia Declaration) was charged by a committee with the task of putting together a statement presenting the views of the Second Continental Congress on the need for independence from the British. After some edits by Franklin and others, the committee brought forth the founding American document, the Declaration of Independence, the second paragraph of which begins with that resounding universalist sentence:

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.

And it continues

Governments are instituted among Men, deriving their just powers from the consent of the governed

The ideas expressed in the Virginia Declaration of Rights and the Declaration of Independence are radical, and they had to have legitimacy. So we are back to the original mystère: where did that legitimacy come from?

Clearly, phrases like “all men are by nature equally free and independent” and “all men are created equal” reverberate with an Iroquois, New World sensibility. Scholars also see here the hand of John Locke, most literally in the use of Locke’s  phrase “the pursuit of Happiness”; they also see the influence of Locke in the phrase “consent of the governed” – the idea of popular sovereignty being a concept closely associated with social contract philosophers of the European Enlightenment such as Locke and Rousseau.

Others also see the influence of the Scottish Enlightenment, that intellectual flowering with giants like David Hume and Adam Smith. The thinkers whose work most directly influenced the Americans include Thomas Reid (“self-evident” is drawn from the writings of this founder of the Common Sense School of Scottish Philosophy) and Francis Hutcheson (“unalienable rights” is drawn from his work on natural rights – Hutcheson is also known for his impact on his student Adam Smith, who later held Hutcheson’s Chair of Moral Philosophy at Glasgow). Still others hear echoes of Thomas Paine and the recently published Common Sense – that most vocal voice for independence.

Jefferson would have been familiar with the Scottish Enlightenment; he also would have read the work of Locke and, of course, Paine’s pamphlet. The same most probably applies to George Mason as well. In any case, the Declaration of Independence went through multiple drafts, was read and edited by others of the Second Continental Congress and eventually was approved by the Congress by vote; so it must also have reflected a generally shared sense of political justice.

On July 2, 1776, the Second Continental Congress did take that bold step and voted in favor of Richard Henry Lee’s motion for independence from Great Britain. On July 4th, it officially adopted the Declaration of Independence.

Having declared independence from the British Crown, the representatives at the Second Continental Congress now had to come up with a scheme to unify a collection of fiercely independent political entities. Where could they have turned for inspiration and example in a political landscape dominated by powerful monarchies? Mystère. More to come.


The Rule of St Benedict goes back to the beginning of the Dark Ages. Born in Umbria in central Italy at the end of the 5th century, as the Roman Empire was ending, Benedict is known as the Father of Western Monasticism; he laid down a social system for male religious communities that was the cornerstone of monastic life in Western Europe for a thousand years and one that endures to this day. It was a way of life built around prayer, manual work, reading and, of course, submission to authority. The Benedictine abbey was led by an abbot; under him were the priests, who were the ones to copy manuscripts and to sing Gregorian chant; then there were the brothers, who bore the brunt of much of the physical work, and lay people, who bore as much or more. The Rule of St. Benedict is followed not only by the Benedictines, the order he founded, but also by the Cistercians, the Trappists and others.

The monasteries are credited with preserving Western Civilization during the Dark Ages; copying manuscripts led to the marvelous decorative calligraphy of the Book of Kells and other masterpieces – the monks even introduced the symbol @ (the arobase, aka the “at sign”) to abbreviate the Latin preposition ad. They are also credited with sending out those fearless missionaries who brought literacy and Christianity to pagan Northern tribes, bringing new peoples into the orbit of Rome: among them were Winfrid (aka Boniface) from Wessex, who chopped down sacred oak trees to proselytize German tribes, and Willibrord from Northumbria, who braved the North Sea to convert the fearsome Frisians, destroying pagan sanctuaries and temples in the process.

The monasteries also accumulated vast land holdings (sometimes as donations from aging aristocrats who were more concerned with making a deal with God than with the future of their children and heirs). With land and discipline came wealth, and monasteries became the target of choice for Viking marauders. At the time of the Protestant Reformation, the monasteries fell victim to iconoclasts and plundering potentates. Henry VIII dissolved the Cistercian abbey at Rievaulx in Yorkshire and other sites, procuring jewelry for Anne Boleyn in the process. This was the kind of thing that provoked the crypto-Catholic poet Shakespeare to lament about the

Bare ruin’d choirs, where late the sweet birds sang

Even today, though in ruins, Rievaulx is magnificent as a visit to Yorkshire or a click HERE will confirm. It can also be noted that the monks at Rievaulx abandoned the Rule of St. Benedict in the 15th century and became rather materialistic; this probably made them all the more tempting a target for Henry VIII.

We also owe the great beers to the monks. Today, two of the most sought-after Belgian beers in the world are Trappiste and Abbaye de Leffe. The city of Munich derives its name from the Benedictine monastery that stood in the center of town. Today’s Paulaner and Augustiner beers trace back to monasteries. But what was it that turned other-worldly monks into master brewers? Mystère.

The likely story is that it was the Lenten fast that drove the monks to secure a form of nourishment acceptable to the prying papal legates. Theirs was a liquid-only fast from Ash Wednesday through Holy Saturday, Sundays included. The beers they crafted passed muster as non-solid food and were rich in nutrients. There are also stories that the strong brews were considered undrinkable by Italian emissaries from Rome and so were authorized to be drunk during Lent as additional mortification of the flesh.

Indeed, the followers of the Rule of St. Benedict didn’t stop there. We also owe bubbly to the Benedictines. The first documented sparkling wine was made in 1531 at the Benedictine Abbey of St. Hilaire in the town of Limoux in the south of France – close to the Mediterranean, between Carcassonne and Perpignan – a mere 12 miles from Rennes-le-Château (of Da Vinci Code fame). Though wine-making went back centuries and occasionally a wine would have a certain effervescence, these churchmen discovered something new. What was their secret? Mystère.

These Benedictine monks were the first to come upon the key idea for bubbly – a second fermentation in the bottle. Their approach is what is now called the “ancestral method”: first making a white wine in a vat or barrel (“cuve” in French), interrupting the vinification process, putting the wine in bottles and adding yeast, grape sugar or some alcohol, corking it and letting it go through a second fermentation in the (hopefully strong and well-sealed) bottle. This is the technique used for the traditional Blanquette de Limoux; it is also used for the Clairette de Die. The classic Blanquette de Limoux was not strong in alcohol, around 6–8%, and it was rather doux and not brut. Today’s product comes in brut and doux versions and is 12.5% alcohol.

By the way, St. Hilaire himself was not a monk but rather a 4th century bishop and defender of the orthodoxy of the Nicene Creed (the one intoned in the Catholic and Anglican masses and other services); he is known as the “Athanasius of the West” which puts him in the company of a man with a creed all of his own – the Athanasian Creed forcefully affirms the doctrine of the Triune God and is read on Trinity Sunday.

The original ancestral method from Limoux made for a pleasant quaff, but not for the bubbly of today. Did the Benedictines come to the rescue once again? What devilish tricks did they come up with next? Or was it done by some other actors entirely? Mystère.

This time the breakthrough to modern bubbly took place in the Champagne region of France. This region is at the point where Northern and Southern Europe meet. In the late middle ages, it was one of the more prosperous places in Europe, a commercial crossroads with important fairs at cities like Troyes and Rheims. The cathedral at Rheims is one of the most stunning in Europe and the French kings were crowned there from Louis the Pious in 816 to Charles X in 1825. In fact, Rheims has been a city known to the English speaking world for so long that its name in French (Reims) has diverged from its older spelling which we still use in English. It is also where English Catholics in the Elizabethan and Jacobean periods published the Douay-Rheims translation of the Latin Vulgate.

Enter Pierre Pérignon, son of a prosperous bourgeois family, who in 1668 joined the Benedictine monastery at St. Pierre de Hautvillers. By that time the order deigned to accept non-noble commoners as monks and would dub them dom, a title drawn from the Latin word for lord, dominus, to make up for their lack of a traditional aristocratic handle.

Dom Pérignon was put in charge of wine making and wine storage, both critical to the survival of the monastery, which had fallen on hard times with only a few monks left and things in a sorry state. The way the story is told in France, it was during a pilgrimage south and a stay with fellow Benedictines at the monastery of St. Hilaire in Limoux that he learned the ancestral method of making sparkling wine. However he learned of their techniques, he spent the rest of his life developing and perfecting the idea. By the time of his death in 1715, champagne had become the preferred wine at the court of Louis XIV and the wine of choice of the fashionable rich in London. Technically, Dom Pérignon was a master at choosing the right grapes to blend to make the initial wine. He abandoned the ancestral technique of interrupting the fermentation of the wine in the cuve or vat and let the wine complete its fermentation. Then, to deal with the problem of dregs caused by the second fermentation in the bottle, he developed the elaborate practice of rotating each bottle by a quarter turn, a step repeated every day for two or more months and known as the remuage (click HERE for an illustration); it is said that professionals can do 40,000 bottles a day.

To all that, one must add that he found a solution to the “exploding bottle problem”: as the pressure of the CO2 that creates the bubbles builds up during the fermentation in the bottle, bottles can explode spontaneously and even set off a chain reaction. To deal with this, Dom Pérignon turned to bottle makers in London who could make bottles that could withstand the build-up of all that pressure. The indentation in the bottom of the bottle (the punt or kick-up in English, cul in French) was also modified, and better corks from Portugal played a part as well.

Putting all this together yielded the champagne method. Naturally, wine makers in other parts of France have applied this process to their own wines, making for some excellent bubbly; these wines used to carry the label “Méthode Champenoise” or “Champagne Method.” Protection of the right to the term Champagne itself is even included in the Treaty of Versailles; more recently, the restriction was tightened so that only wines from the Champagne region may even be labeled “Champagne Method.” So the other wines produced this way are now called crémants (a generic term for sparkling wine). Thus we have the Crémant d’Alsace, the Crémant de Bourgogne, the Crémant de Touraine and even the Crémant de Limoux. All in all, these crémants constitute the best value in French wines on the market. Other countries have followed the French lead and do not use the label “Champagne” or “Champagne Method” for sparkling wines; even the USA now follows the international protocol (although wines labeled Champagne prior to 2006 are exempt).

Admittedly, champagne is a marvelous beverage. It has great range and goes well with steak and with oysters. The sound of the pop of a champagne cork means that the party is about to begin. Of course, one key thing is that champagne provides a very nice “high”; and it does that without resorting to high levels of alcoholic content. Drolly, at wine tastings and similar events, the quality of the “high” provided by the wine is never discussed directly – instead, tasters skirt around it talking about legs and color and character and what not, while the key thing is the quality of the “high” and the percentage of alcohol in the wine. So how do you discuss this point in French itself? Mystère. In effect, there is no way to deal with all this in French, la Langue de Voltaire. The closest you can come to “high” is “ivresse,” but that has a negative connotation; you might force the situation and try something like “douce ivresse,” but that doesn’t work either. ’Tis a mystère without a solution, then. But it is the main difference between a $60 bottle of Burgundy and a $15 bottle of Pinot Noir – do the experiment.

There are some revisionist historians who try to diminish the importance of Dom Pérignon in all this. But he has been elevated to star status as an icon for the Champagne industry. As an example, click HERE for a statue in his honor erected by Moët & Chandon; click HERE for an example of an advertisement they actually use to further sales.

So the secret to the ancestral method and the champagne method is that second fermentation in the bottle. But now we have popular alternatives to champagnes and crémants in the form of Prosecco and Cava. If these very sparkling wines are not made with the champagne method, how are they made? Mystère.

In fact, the method used for these effervescent wines does not require that second fermentation in the bottle, a simplification made possible by the invention of stainless steel toward the end of the 19th century. This newer method carries out the secondary fermentation in closed stainless steel tanks that are kept under pressure. This simplifies the entire production process and makes the bubbly much less expensive to make. The process was first invented by Federico Martinotti in Italy, then improved by Eugène Charmat in France – all this in the 1890’s. So it is known as the metodo italiano in Italy and the Charmat method nearly everywhere else. In the 1960’s things were improved further to allow for less doux, more brut wines. These wines are now extremely popular and considered a good value by the general wine-loving public.

Finally, there is the simplest method of all to make sparkling wine or cider – inject carbon dioxide directly into the finished wine or cider, much like making seltzer water from plain water with a SodaStream device. In fact, this is the method used for commercial ciders. Please don’t try this at home with a California chardonnay if you don’t have strong bottles, corks and protective wear.

Indeed, beer and wine have played an important role in Western Civilization; wine is central to rituals of Judaism and Christianity; the ancient Greeks and Romans even had a god of wine. In fact, beer and wine go back to the earliest stages of civilization. When hunter-gatherers metamorphosed into farmers during the Neolithic Revolution and began civilization as we know it, they traded a lifestyle in which they were taller, healthier and longer-lived for one in which they were shorter, less healthy, had a shorter life span and had to endure the hierarchy of a social structure with priests, nobles, chiefs and later kings. On the other hand, archaeological evidence often points to the fact that one of the first things these farmers did was to make beer by fermenting grains. Though the PhD thesis hasn’t been written yet, it is perhaps not unsafe to conclude that fermentation made this transition bearable and might even have been the start of it.




North America III

When conditions allowed, humans migrated across the Bering Land Bridge moving from Eurasia to North America thousands of years before the voyages of discovery of Columbus and other European navigators. That raises the question whether there were Native Americans who encountered Europeans before Columbus. If so, were these encounters of the first, second or third kind? Mystère.

For movement west by Europeans before Columbus, first we have the Irish legend of the voyage of St. Brendan the Navigator. Brendan and his 16 mates (St. Malo among them) sailed in a currach – an Irish fisherman’s boat with a wooden frame over which are stretched animal skins (nowadays they use canvas and sometimes anglicize the name to curragh). These seafarers reportedly reached lands far to the West, even Iceland and Newfoundland, in the years 512-530 A.D. All this was presented as fact by nuns in parochial schools of yore, and, begorrah, there is indeed archaeological evidence of the presence of Irish visitors on Iceland before any Viking settlements there. Moreover, in the 1970’s the voyage of St. Brendan was reproduced by the adventurer Tim Severin and his crew, which led to the best-selling book The Brendan Voyage, lending further credence to the nuns’ version of history. However, there is no account in the legends of any contact with new people; for contact with a mermaid, click HERE.

In the late 9th century, Viking men accompanied by (not so willing) Celtic women reached Iceland and established a settlement there (conventionally dated to 874 A.D.). Out of Iceland came the Vinland Saga of the adventures of Leif Erikson (who converted to Christianity and so gained favor with the later saga compilers) and of his fearless sister Freydis Eriksdottir (who also led voyages to Vinland but who stayed true to her pagan roots). It has been established that the Norse did indeed reach Newfoundland and did indeed begin to found a colony there; the site at L’Anse-aux-Meadows has yielded abundant archaeological evidence of this – all taking place around 1000 A.D. The Norse of the sagas called the indigenous people they encountered in Vinland the Skraeling (which can be translated as “wretched people”). These people were not easily intimidated; there were skirmishes and more between the Skraeling (who did have bows and arrows) and the Norse (who did not have guns). In one episode, Freydis grabs a fallen Viking’s sword and drives the Skraeling attackers off on her own – click HERE for a portrait of Freydis holding the sword to her breast.

Generally speaking, the war-like nature of the Skraeling is credited with keeping the Vikings from establishing a permanent beachhead in North America. So these were the first Native Americans to encounter Europeans, solving one mystery. But exactly which Native American group were the Skraeling? What is their migration story? Again mystère.

The proto-Inuit or Thule people, ancestors of the modern Inuit, emerged in Alaska around 1000 B.C. Led by their sled dogs, they “quickly” made their way east across Arctic Canada, then down into Labrador and Newfoundland. The proto-Inuit even made their way across Baffin Bay to Greenland. What evidence there is supports the idea that these people were the fierce Skraeling of the sagas.

As part of the movement west, according to the sagas, Norse settlers came to Greenland around 1000 A.D. led by the notorious Erik the Red, father of both Leif and Freydis. Settlers from Norway as well as Iceland joined the Greenland colony and it became a full-blown member of medieval Christendom, what with churches, a monastery, a convent and a bishop. In relatively modern times, around 1200 A.D., the proto-Inuit reached far enough south in Greenland to encounter the Europeans there. Score one more for these intrepid people. These are the only two pre-Columbian encounters between Europeans and Native Americans that are well established.

In the end, though, the climate change of the Little Ice Age (which began around 1300 A.D.) and the Europeans’ impact on the environment proved too much and the Greenland colony died out sometime in the 1400’s. The proto-Inuit population with their sled dogs stayed the course (though not without difficulty) and survived. As a further example of the impact of the climate on Western Civilization, the Little Ice Age practically put an end to wine making in the British Isles; the crafty and thirsty Scots then made alcohol from grains such as barley and Scotch whisky was born.

The success of the proto-Inuit in the Arctic regions was based largely on their skill with the magnificent sled dogs that their ancestors had brought with them from Asia. The same can be said for the Aleut people and their Alaskan Malamute Dog. Both these wolf-like marvels make one want to read or reread The Call of the Wild; for a Malamute picture, click HERE.

We know the Clovis people and other populations in the U.S. and Mexico also had dogs that had come from Siberia. But today in the U.S. we are overwhelmed with dogs of European origin – the English Sheepdog, the Portuguese Water Dog, the Scottish Terrier, the Irish Setter, the French Poodle, the Cocker Spaniel, and on and on. What happened to the other dogs of the Native Americans? Are they still around? Mystère.

The simple answer is that, for the most part, in the region south of the Arctic, the native dogs were simply replaced by the European dogs. However, for the lower 48, it has been established recently that the Carolina Dog, a free range dingo, is Asian in origin. In Mexico, the Chihuahua is the only surviving Asian breed; the Xoloitzcuintli (aka the Mexican Hairless Dog) is thought to be a hybrid of an original Asian dog and a European breed.  (Some additional Asian breeds have survived in South America.)

Still it is surprising that the native North Americans south of the Arctic regions switched over so quickly and so completely to the new dogs. A first question that comes to mind is whether or not these two kinds of dogs were species of the same animal; this question can’t be answered since dogs and wolves are already hard to distinguish – they can still interbreed and their DNA split only goes back at most 30,000 years. A more tractable formulation would be whether or not the Asian dogs and the European dogs are the issue of the same domestication event or of different domestication events. However, “it’s complicated.” One sure thing we know comes from the marvelous cave at Chauvet-Pont d’Arc in southern France, where footprints of a young child walking side by side with a dog or wolf have been found which date back some 26,000 years. This would point to a European domestication event; however, genetic evidence tends to support an Asian origin for the dog. Yet another theory, backed by logic and some evidence, is that both events took place but that subsequently the Asian dogs replaced the original European dogs in Western Eurasia.

For one thing, the dogs that came with the Native Americans from Siberia were much closer to the dogs of an original Asian self-domestication that took place in China or Mongolia (according to PBS). In any case, they would not have gone through as intense and specialized a breeding process as would dogs in populous Europe, a process that made the European dogs more useful to humans south of the Arctic and more compatible with domestic animals such as chickens, horses and sheep. Until the arrival of Europeans, the Native Americans did not have domestic animals and did not have resistance to the poxes associated with them.

The role of dogs in the lives of people grows ever more important and dogs continue to take on new work roles – service dogs, search and rescue dogs, guide dogs, etc. People with dogs reportedly get more exercise, get real emotional support from their pets and live longer. And North America has given the world the wonderful Labrador Retriever. The “Lab” is a descendant of the St John’s Water Dog, which was bred in the Province of Newfoundland and Labrador; this is the only Canadian province to bear a Portuguese name – most probably for the explorer João Fernandes Lavrador, who claimed the area for the King of Portugal in 1499, an area purportedly already well known to Portuguese, Basque and other fearless fishermen who were working the Grand Banks before Columbus (but we have no evidence of encounters of such fishermen with the Native Americans). At one point later in its development, the Lab was bred in Scotland by the Duke of Buccleuch, whose wife the Duchess was Mistress of the Robes for Queen Victoria (played by Diana Rigg in the PBS series Victoria – for the actual duchess, click HERE). From Labrador and Newfoundland, the Lab has spread all over North America and is now the most popular dog breed in the U.S. and Canada – and in the U.K. as well.

North America II

At various times in history, the continents of Eurasia and North America have been connected by the Bering Land Bridge, which is formed when water levels recede and the Bering Sea is filled in (click HERE for a dynamic map that shows changes in sea level over the last 21,000 years).

When conditions allowed, humans (with their dogs) migrated across the Bering Land Bridge moving from Eurasia to North America. It is not certain exactly when this human migration began and when it ended but a typical estimated range is from 20,000 years ago to 10,000 years ago. It has also been put forth that some of these people resorted to boats to ferry them across this challenging, changing area.

DNA analysis has verified that these new arrivals came from Siberia. It has also refuted Thor Heyerdahl’s “Kon Tiki Hypothesis” that Polynesia was settled by rafters from Peru – Polynesian DNA and American DNA do not overlap at all. On the other hand, there is DNA evidence that some of the trekkers from Siberia had cousins who lit out in the opposite direction and became the aboriginal people of New Guinea and of Australia!

The Los Angeles area is famous for its many attractions. Among them is the site of the La Brea Tar Pits, click HERE for a classic image. This is the resting place of countless animals who were sucked into the primeval tar ooze here over a period of thousands and thousands of years. What is most striking is that so many of them have gone extinct, especially large animals such as the camel, the horse, the dire wolf, the giant sloth, the American lion, the sabre-toothed tiger, … . In fact, with the exception of the jaguar, the musk ox, the moose, the caribou and the bison, all the large mammals of North America disappeared by 8,000 years ago. Humans arrived in North America not so many years before, and we know they were successful hunters of large mammals in Eurasia. So, as in radio days, the $64 question is: did humans cause the extinction of these magnificent animals in North America? Mystère.

The last Ice Age lasted from 110,000 years ago to 11,700 years ago (approximately, of course). During this period, glaciers covered Canada, Greenland and states from Montana to Massachusetts. In fact, the Bering Land Bridge was created when sea levels dropped because of the massive accumulation of frozen water locked in the glaciers. The glaciers didn’t just sit still; rather they moved South, receded to the North and repeated the cycle multiple times. In the process, they created the Great Lakes, flattened Long Island, and amassed that mound of rubble known as the Catskill Mountains – technically the Catskills are not mountains but rather a mass of monadnocks. In contrast, the Rockies and Appalachians are true mountains, created by mighty tectonic forces that thrust them upward. In any case, south of the glaciers, North America was home to many large mammal species.

As the Ice Age ended, much of North America was populated by the hunting group known as the Clovis People. Sites with their characteristic artifacts have been excavated all over the lower 48 and northern Mexico. For a picture of their characteristic spear head, click HERE. The term Clovis refers to the town of Clovis, NM, scene of the first archaeological dig that uncovered this culture; luckily this dig preceded any work at the nearby town of Truth or Consequences, NM.

The Clovis people, who dominated North America south of the Arctic at the time of these mass extinctions, were indeed hunters of large and small mammals and that has given rise to the “Clovis overkill hypothesis” – that it was the Clovis people who hunted the horse and other large species to extinction. The hypothesis is supported by the fact that Clovis sites have been found with all kinds of animal remains – mammoths, horses, camels, sloths, tapirs and other species.

For one thing, these animals did not co-evolve with humans and had not honed defensive mechanisms against them – unlike, say, the zebras of Africa (descendants of the original North American horse), which are aggressive toward humans; zebras have neither been hunted to extinction nor domesticated. In fact, the sociology of grazing animals such as the horse can work against them when pursued by hunters. The herd provides important defense mechanisms – mobbing, confusing, charging, stampeding, plus the presence of bulls. But since these animals employ a harem system and the herd has many cows per bull, the solitary males or the males in very small groups who are not part of the herd are natural targets for hunting. Even solitary males who use speed of flight as their first line of defense can fall victim to persistence hunting, a tactic known to have been used by humans – pursuing prey relentlessly until it is immobile with exhaustion: a horse can run the 10 furlongs of the Kentucky Derby faster than a human in any weather, but on a hot day a sweating human can beat a panting horse in a marathon. An example of persistence hunting in more modern times is stag hunting on horseback with yelping dogs – the hunt ends when the stag is exhausted and immobile and the coup de grace is administered by the Master of the Hunt. As for dogs, they were new to the North American mammals and certainly would have aided the Clovis people in the hunt.

Also there is a domino effect as an animal is hunted to extinction: the predator animals that depend on it are also in danger. By way of example, it is thought that the sabre-toothed tiger disappeared because its prey the mammoth went extinct.

Given all this, how did the bison and caribou survive? In fact, for the bison, it was quite the opposite: there was a bison population explosion. Given that horses and mammoths have the same diet as bison, some scientists postulate that competition with the overly successful bison drove the others out. Another thing is that bison live in huge herds while animals like horses live in small bands. It is theorized that the caribou, who also travel in massive herds, survived by pushing ever further north into the arctic regions where ecological conditions were less hostile for them and less hospitable to humans and others.

However, all this was happening at a dramatic period, the end of the Ice Age. So warming trends, the end of glaciation and other environmental changes could have contributed to this mass extinction: open spaces were replaced by forests reducing habitat, heavy coats of fur became a burden, … In fact, the end of the last Ice Age is also the end of the much longer Pleistocene period; this was followed by the much warmer Holocene period, which is the one we are still in today. So the Ice Age and the movement of the glaciers suddenly ended; this was global warming to a degree that would not be seen again until the present time. The warming that followed the Ice Age would also have changed the ecology of insects, arachnids, viruses et al., with a potentially lethal impact on plant life and on mega fauna. Today we are witnessing a crisis among moose caused by the increase of the winter tick population, which is no longer kept in check by cold winters. We are also seeing insects unleashed to attack trees. Along the East Coast, it is the southern pine beetle which has now reached New England – on its Shermanesque march north, this beetle has destroyed forests, woods, groves, woodlands, copses, thickets and stands of once proud pine trees. It is able to move north because the minimum cold temperature in the Mid-Atlantic states has warmed by about 7 degrees Fahrenheit over the last 50 years. In Montana and other western states it is the mountain pine beetle and the endless fires that are destroying forests.

Clearly, the rapidly occurring and dramatic transformations at the end of the Ice Age could have disrupted things to the point of causing widespread extinctions – evolution did not have time to adjust.

And then there is this example where the overkill hypothesis is known not to apply. The most recent extinction of mammoths took place on uninhabited Wrangel Island in the Arctic Ocean off the Chukchi Peninsula in Siberia, only 4,000 years ago, and that event is not attributed to humans in any way. The principal causes cited are environmental factors and genetic meltdown – the accumulation of bad traits due to the small size of the breeding population.

In sum, scientists make the case that climate change and environmental factors were the driving forces behind these extinctions and that is the current consensus.

So it seems that the overkill hypothesis is an example of the logical fallacy “post hoc ergo propter hoc” – A happened after B, therefore B caused A. By the way, this fallacy is the title of the 2nd episode of The West Wing, where Martin Sheen as POTUS aggressively invokes it to deconstruct his staff’s analysis of his electoral loss in Texas; in the same episode, Rob Lowe shines in a subplot involving a call girl who is working her way through law school! Still, the circumstantial evidence is there – humans, proven hunters of mammoths and other large fauna, arrive, and multiple large mammals disappear.

The situation for the surviving large mammals in North America is mixed at best. The bison are threatened by a genetic bottleneck (a depleted gene pool caused by the Buffalo Bill-era slaughter to make the West safe for railroads); the moose by climate change and tick-borne diseases; the musk ox’s habitat is reduced to arctic Canada; the polar bear and the caribou have been declared vulnerable species; the brown bear and its subspecies the grizzly bear also range over habitats that have shrunk. The fear is that human involvement in climate change is moving things along so quickly that future historians will be analyzing the current era in terms of a new overkill hypothesis.

North America I

Once upon a time, there was a single great continent – click HERE – called Pangaea. It formed 335 million years ago, was surrounded by the vast Panthalassic Ocean and only began to break apart about 175 million years ago. North America dislodged itself from Pangaea and started drifting west; this went on until its plate rammed into the Pacific plate and the movement dissipated, but not before the Rocky Mountains swelled up and reached great heights.

As the North American piece broke off, it carried flora and fauna with it. But today we know that many land species here did not come on that voyage from Pangaea; even the iconic American bison (now the National Mammal) did not originate here. How did they get to North America? Something of a mystère.

Today the Bering Strait separates Alaska from the Chukchi Peninsula in Russia but, over the millennia, there were periods when North America and Eurasia were connected by a formation known as the Bering Land Bridge, aka Beringia. It is rises and ebbs in sea level due to glaciation, rather than continental drift, that create the bridge. When the land bridge resurfaces, animals make their way from one continent to the other in both directions.

Among the large mammals who came from Eurasia to North America by means of the Bering Land Bridge were the musk ox (click HERE), the steppe mammoth, the steppe bison (ancestor of our bison), the moose, the cave lion (ancestor of the American lion) and the jaguar. The steppe mammoth and the American lion are extinct today.

Among the large mammals native to North America were the Columbian mammoth, the sabre-toothed tiger, and the dire wolf. All three are now extinct; for an image of the three frolicking together in sunny Southern California, click HERE.

The dire wolf lives on, however, in the TV series Game of Thrones and this wolf is the sigil of the House of Stark; so it must also have migrated from North America to the continent of Westeros – who knew!

Also among the large mammals native to North America are the short-faced bear, the tapir, the brown bear, and the caribou (more precisely, this last is native to Beringia). The first two are sadly extinct today.

In school we all learned how the Spanish conquistadors brought horses to North America, how the Aztecs and Incas had never seen mounted troops before and how swiftly their empires fell to much smaller forces as a result. An irony here is that the North American plains were the homeland of the species Equus and horses thrived there until some 10,000 years ago. Indeed, the horse and the camel are among the relatively few animals to go from North America to the more competitive Eurasia; these odd-toed and even-toed ungulates prospered there and in Africa – even zebras are descended from the horses that made that crossing. The caribou also crossed to Eurasia, where they are known as reindeer.

Similarly, South America split off from Pangaea and drifted west creating the magnificent range of the Andes Mountains before stopping.

The two New World continents were not connected until volcanic activity and plate tectonics created the Isthmus of Panama about 2.8 million years ago – some say earlier (click HERE). The movement of animals between the American continents via the Isthmus of Panama is called the Great American Interchange. Among the mammals who came from South America to North America was the mighty ground sloth (click HERE).

This sloth is now extinct but some extant smaller South American mammals such as the cougar, armadillo, porcupine and opossum also made the crossing. The opossum is a marsupial (the ground sloth, despite its exotic look, was a relative of the armadillo and the anteater); before the Great American Interchange, there were no marsupials in North America, just as there are none in Eurasia or Africa.

The camel, the jaguar, the tapir, the short-faced bear and the dire wolf made their way across the Isthmus of Panama to South America. The camel is the ancestor of today’s llamas, vicuñas, alpacas and guanacos. The jaguar and tapir have found the place to their liking, the short-faced bear has evolved into the spectacled bear but the dire wolf is not found there today; it is not known if it has survived on the fictional continent of Essos.

The impact of the movement of humans and dogs into North America is a subject that needs extensive treatment and must be left for another day (post North America III). But one interesting side-effect of the arrival of humans has been the movements of flora in and out of North America. So grains such as wheat, barley and rye have been brought here from Europe, the soybean from Asia, etc. In the other direction, pumpkins, pecans, and cranberries have made their way from North America to places all over the planet. Two very popular vegetables that originated here and that have their own stories are corn and the sweet potato.

North America gave the world maize. This word came into English in the 1600s from the Spanish maiz which in turn was based on an Arawak word from Haiti. The Europeans all say “maize” but why do Americans call it “corn”? Mystère.

In classical English, corn was a generic term for the locally grown grain – wheat, barley, rye … . Surely Shakespeare was not thinking of maize when he included this early version of Little Boy Blue in King Lear, where Edgar, masquerading as Mad Tom, recites

Sleepest or wakest thou, jolly shepherd?

Thy sheep be in the corn.

And for one blast of thy minikin mouth,

Thy sheep shall take no harm.

So naturally the English colonists called maize “Indian corn” and the “Indian” was eventually dropped for maize in general – although “Indian corn” is still in widespread use for a maize with multicolored kernels, aka flint corn. If you want to brush up on your Shakespeare, minikin is an archaic word that came into English from the Dutch minneken and that means small.

That other culinary gift of North America whose name has a touch of mystère is the sweet potato. In grocery stores, one sees more tubers labeled yam than one does sweet potato. However, with the rare exception of imported yams, all these are actually varieties of the sweet potato. The yam is a different thing entirely – it is a perennial herbaceous vine, while a sweet potato is in the morning glory family (and is the more nutritious vegetable). The yam is native to Africa and the term yam was probably brought here by West Africans.

The sweet potato has still another language problem: it was the original vegetable that was called the batata and brought back to Spain early in the 1500’s; later the name was usurped by that Andean spud which was also called a batata, and the original vegetable had to add “sweet” to its name, making it a retronym (like “acoustic guitar”, “landline phone” and “First World War”).

Gold Coin to Bit Coin

The myth of Midas is about the power of money – it magically transforms everything in its path and turns it into something denominated by money itself. The Midas story comes from the ancient lands of Phrygia and Lydia, in western modern-day Turkey, close to the Island of Lesbos and to the Ionian Greek city of Smyrna where Homer was born. It was the land of a people who fought Persians and Egyptians, who had an Indo-European language, and who were pioneering miners and metal workers. It was the land of Croesus, the King of Sardis, the richest man in the world, whose wealth gave rise to the expression “as rich as Croesus.” Click HERE for a map.
Croesus lived in the 6th century B.C., a century dominated by powerful monarchs such as Cyrus the Great of Persia (3rd blog appearance), Tarquin the Proud of Rome, Nebuchadnezzar II of Babylon, Astyages of the Medes, Zedekiah the King of Judah, the Pharaoh Amasis II of Egypt, and Hamilcar I of Carthage. How could the richest man in the world at that time be a king from a backwater place such as Sardis in Lydia? Mystère.
In the time of Achilles and Agamemnon, in the Eastern Mediterranean as in the rest of the world, goods and services were exchanged by barter. Shells, beads, cocoa beans and other means were also used to facilitate exchange, in particular to round things out when the goods bartered didn’t quite match up in value.
So here is the simple answer to our mystère: The Lydians introduced coinage to the world, a first in history, and thus invented money and its magic – likely sometime in the 7th century B.C. Technically, the system they created is known as commodity money, one where a valuable commodity such as gold or silver is minted into a standardized form.
Money has two financial functions: exchange (buying and selling) and storage (holding on to your wealth until you are ready to spend it or give it to someone else). So it has to be easy to exchange and it has to hold its value over time. The first Lydian coins were made from an alloy of gold and silver that was found in the area; later technology advances meant that by Croesus’ time, coins of pure gold or pure silver could be struck and traded in the marketplace. All this facilitated commerce and led to an economic surge which made Croesus the enduring personification of wealth and riches. Money has since made its way into every nook and cranny of the world, restructuring work and society and creating some of the world’s newest professions along the way; money has evolved dramatically from the era of King Croesus’ gold coin to the present time and Satoshi Nakamoto’s Bitcoin.
The Lydian invention was adopted by the city states of Greece. Athens and other cities created societies built around the agora, the market-place, as opposed to the imperial societies organized around the palace with economies based on in-kind tribute – taxes paid in sacks of grain, not coin. These Greek cities issued their own coins and created new forms of political organization such as democratic government. This is a civilization far removed from the warrior world of the Homeric poems. The agora model spread to the Greek cities of Southern Italy, such as Crotone in Calabria, known for Pythagoras and his Theorem, and Syracuse in Sicily, renowned for Archimedes and his “eureka moment.” Further north in Rome, coinage was introduced to facilitate trade with Magna Graecia, as the Romans called this region.
The fall of Rome came about in part because the economy collapsed under the weight of maintaining an overextended and bloated military. Western Europe then fell into feudalism, which only ended with the growth of towns and cities in the late Middle Ages: new trading networks arose such as the Hanseatic League and the Fairs of the Champagne region (this is before the invention of bubbly), and the revival of banking in Italy fueled the Renaissance. The banks used bills of exchange backed by gold. This made the bill of exchange a kind of paper money, though one that circulated only within the banking community.
Banks make money by charging interest on loans. The Italian banks did not technically charge interest because back then charging interest was the sin of usury in the eyes of the Catholic Church – as it still is in Sharia Law. Rather, they sidestepped this prohibition with the bills of exchange and took advantage of exchange rates between local currencies – their network could transfer funds from London to Lyons, from Antwerp to Madrid, from Marseilles to Florence, … . The Christian prohibition against interest actually goes back to the Hebrew Bible, but Jewish law itself only prohibits charging interest on loans made to other Jews. Although the reformers Luther and Calvin both condemned charging interest, the Protestant world as well as the Church of Rome eventually made their peace with this practice.
In China too, paper exchange notes appeared during the Tang Dynasty (618–907 A.D.) and news of this wonder was brought back to Europe by Marco Polo; perhaps that is where the Italian bankers got the idea of replacing gold with paper. Interestingly, the Chinese abandoned paper entirely in the mid-15th century during a bout of inflation and only re-introduced it in recent times.
Paper money that is redeemable for a set quantity of a precious metal is an example of representative currency. In Europe, the first true representative paper currency was introduced in Sweden in 1661 by the Stockholm Bank. Sweden waged major military campaigns during the Thirty Years War which lasted until 1648 and then went on to a series of wars with Poland, Lithuania, Russia and Denmark that only ended in 1658. The Stockholm bank’s mission was to reset the economy after all these wars but it failed after a few years simply because it printed much more paper money than it could back up with gold.
The Bank of England was created in 1694 in order to deal with war debts. It issued paper notes for the British pound backed by gold and silver. The British Pound became the world’s leading exchange currency until the disastrous wars of the 20th century.
The trouble with gold and silver is that supply is limited. The Spanish had the greatest supply of gold and silver and so the Spanish peso coin was the most widespread and the most used. This was especially the case in the American colonies and out in the Pacific. In the English-speaking colonies and ex-colonies, pesos were called “pieces of 8” since a peso was equivalent to 8 bits, the bit being a Spanish and Mexican coin that also circulated widely. The story of the term dollar itself begins in the Joachimsthal region of Bohemia where the coin called the thaler (Joachim being the father of Mary, thal being German for valley) was first minted in 1517 by the Hapsburg Empire; it was then imitated elsewhere in Europe, including in Holland where the daler was introduced and in Scotland where a similar coin took the name dollar. The daler was the currency of New Amsterdam and was used in the colonies. The Spanish peso, for its part, became known as “the Spanish dollar.” After independence, the dollar became the official currency of the U.S. and dollar coins were minted – paper money (except for the nearly worthless continentals of the Revolutionary War) would not be introduced in the U.S. until the Civil War. En passant, it can be noted that the early U.S. dollar was still thought of as being worth 8 bits and so “2 bits” became a term for a quarter dollar. (Love those fractions, 2/8 = 1/4.) The Spanish dollar also circulated in the Pacific region and the dollar became the name of the currencies of Hong Kong, Australia, New Zealand, … .
During the Civil War, the Lincoln government introduced paper currency backed by a quantity of gold to help finance the war. Before long, Lincoln lowered this quantity because of the cost of the war, debasing the dollar in the process.
The money supply is limited by the amount of gold a government has in its vaults; this can have the effect of obstructing commerce. In the period leading up to World War I, in order to get more money into the economy, farmers in the American Midwest and West agitated for silver to back the dollar at the ratio of 16 ounces of silver to 1 ounce of gold. The Wizard of Oz, written in support of this movement, is an allegory where the Cowardly Lion is really William Jennings Bryan, the Scarecrow is the American farmer, the Tin Man is the American worker, and the Wizard is Mark Hanna, the Republican political kingmaker; Dorothy’s magic slippers are silver in the book, not ruby as in the movie. Unlike in the book, in the real world the free silver movement failed to achieve its objective.
Admittedly, this is a long story – but soldier on, Bitcoin is coming up soon.
Needless to say, World War I had a terrible effect on currencies, especially in Europe, where the German mark succumbed to hyperinflation and money was worth less than the cost of printing it. This would have tragic consequences.
In the U.S. during the depression, the Roosevelt government made some very strong moves. It split the dollar into two – the domestic dollar and the international dollar; the domestic dollar was disconnected from gold completely while the dollar for international payments was still backed by gold, but at $35 an ounce, a significant devaluation. A paper currency, like Roosevelt’s domestic dollar, which is not backed by a commodity is called a fiat currency – meaning, in effect, that the currency’s value is declared by the issuer – prices then go up and down according to the supply of the currency and the demand for the currency. To increase the U.S. treasury’s supply of gold, as part of this financial stratagem, the government ordered the confiscation of all privately held gold (bullion, coin, jewelry, … ) with some small exceptions for collectors and dealers. Today, it is hard to imagine people being called upon to bring in the gold in their possession, have it weighed and then be reimbursed in paper money by the ounce.
Naturally, World War II also wreaked havoc with currencies around the world. The Bretton Woods agreement made the post-war U.S. dollar the currency of reference and other currencies were evaluated vis-à-vis the international (gold backed) U.S. dollar. So even the venerable British Pound became a fiat currency in 1946.
The impact of war on currency was felt once again in 1971 when Richard Nixon, with the U.S. reeling from the cost of the war in Vietnam, disconnected the dollar completely from gold, making the almighty dollar a full fiat currency.
Soapbox Moment: The impact of war on a nation and the nation’s money is a recurring theme. Even Croesus lost his kingdom when he was killed in a battle with the forces of Cyrus the Great. Joan Baez and Marlene Dietrich both sang “When will they ever learn?” beautifully in English and even more plaintively in German, but men simply do not learn. Louis XIV said it all on his death bed: the Sun King lamented how he had wrecked the French economy and declared “I loved war too much.” Maybe we should all read or re-read Lysistrata by Aristophanes, the caustic playwright of 5th century Athens.
Since World War II, the world of money has seen many innovations, notably credit cards, electronic bank transfers and the relegation of cash to a somewhat marginal role as the currency of the poor and, alas, the criminal. Coupons and airline miles are examples of another popular form of currency, known as virtual currency; this form of currency has actually been around for a long time – C.W. Post distributed a one cent coupon with each purchase of Grape-Nuts flakes as far back as 1895.
The most recent development in the history of money has been digital currency which is completely detached from coin, paper or even government – its most celebrated implementation being Bitcoin. A bitcoin has no intrinsic value; it is not like gold or silver or even paper notes backed by a precious metal. It is like a fiat currency but one without a central bank to tell us what it is worth. Logically, it should be worthless. But a bitcoin sells for thousands of dollars right now; it trades on markets much like mined gold. Why? Mystère.
A bitcoin’s value is determined by the marketplace: its worth is its value as a medium of exchange and its value as a storage medium for wealth. But Bitcoin has some powerful, innovative features that make it very useful both as a medium of exchange and as a medium of storage; its implementation is an impressive technological tour de force.
In 2008, a pdf file entitled “Bitcoin: A Peer-to-Peer Electronic Cash System,” authored by one Satoshi Nakamoto, was published on-line; click HERE for a copy. “Satoshi Nakamoto” then is the handle of the founder or group of co-founders of the Bitcoin system (abbreviated BTC), which was launched in January 2009. BTC has four special features:
• Unlike the Federal Reserve System, unlike the Bank of England, it is decentralized. There is no central computer server or authority overseeing it. It employs the technology that Napster made famous, peer-to-peer networking: individual computers on the network communicate directly with one another without passing through a central post office. Bitcoin electronic transfers are not instantaneous, but they are very, very fast compared to traditional bank transfers – SWIFT and all that.
• BTC guarantees a key property of money: the same bitcoin cannot be in two different accounts and an account cannot transfer the same bitcoin twice – this is the electronic version of “you can’t spend the same dollar twice.” This also makes it virtually impossible to counterfeit a bitcoin. This is achieved by means of a technical innovation called the blockchain, which is a concise and efficient way of keeping track of bitcoins’ movements over time (“Bob sent Alice 100 bitcoins at noon GMT on 01/31/2018 … ”); it is a distributed, public account book – “ledger” as accountants like to say. A method called hashing, which reduces data of any size to a short digital fingerprint, is employed to keep the size of the blockchain under control. Blockchain technology itself has since been adopted by tech companies such as IBM and One Network Enterprises.
• BTC protects bitcoin transfers with sophisticated cryptography. In addition to hashing, it uses public key cryptography, the method that distinguishes “https” from “http” in URL addresses and makes a site safe. Public key encryption is based on an interesting mathematical idea – a problem that is solvable in principle but cannot be solved in our lifetimes even with the best algorithms running on the fastest computers; this is an example of the phenomenon mathematicians call “combinatorial explosion.” The receiver has created this set of problems or puzzles himself, and so his private key gives him a way to decrypt the sender’s message. Accounts, moreover, are identified only by cryptographic addresses, not names; this makes Bitcoin an example of a crypto-currency, a feature that clearly makes the system attractive to parties with a need for privacy and makes it abhorrent to tax collectors and regulators.
• New bitcoins are created by a process called mining that in some ways emulates mining gold. A new bitcoin is born when a computer manages to be the first to solve a terribly boring mathematical problem at the expense of a great deal of time, computer cycles and electricity; in the process of mining, as a side-effect, the BTC miners perform some of the grunt work of verifying that a new transaction can be added safely to the blockchain. Also, in analogy with gold, there is a limit of 21 million on the number of bitcoins that can be mined. This limit is projected to be reached around the year 2140; as is to be expected, this schedule is based on a clever strategy, one that reduces the rewards for mining over time.
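Two of the mechanisms above – hash-linked blocks and proof-of-work mining – can be sketched in a few lines of Python. This is an illustration only, not the real protocol: actual Bitcoin hashes a binary block header with double SHA-256 and adjusts its difficulty target dynamically, and the transaction strings and difficulty below are made up for the example.

```python
# A toy blockchain: each block commits to the previous block's hash, so
# tampering with an old transaction would invalidate every later block.
# "Mining" = searching for a nonce whose hash clears a difficulty bar.
import hashlib

DIFFICULTY = "0000"  # a mined hash must start with this many hex zeros

def block_hash(prev_hash, transactions, nonce):
    """Hash the block's contents down to a short fingerprint."""
    data = f"{prev_hash}|{transactions}|{nonce}".encode()
    return hashlib.sha256(data).hexdigest()

def mine(prev_hash, transactions):
    """Try nonces until the hash meets the difficulty target."""
    nonce = 0
    while True:
        h = block_hash(prev_hash, transactions, nonce)
        if h.startswith(DIFFICULTY):
            return nonce, h
        nonce += 1

nonce1, h1 = mine("0" * 64, "Bob sent Alice 100 bitcoins")
nonce2, h2 = mine(h1, "Alice sent Carol 40 bitcoins")  # chained to block 1
print(h1, h2)

# The 21 million cap: the block reward started at 50 BTC and halves every
# 210,000 blocks, so total issuance is a geometric series summing to ~21M.
total = sum(210_000 * (50 / 2 ** i) for i in range(33))
print(total)  # just under 21,000,000
```

Anyone can verify a mined block with a single hash computation; only finding the nonce is expensive – that asymmetry is what makes the work useful as proof.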
The bitcoin can be divided into a very small unit called the satoshi. This means that small purchases – say, $5 – can be made. For example, using Gyft, eGifter or another such service, one can use bitcoin for purchases in participating stores or even meals in restaurants. In the end, it is supply and demand that infuse bitcoins with value, the demand created by their usefulness to people. It is easy enough to get into the game; for example, you can click HERE for one of many sites that support BTC banking and the like.
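The satoshi arithmetic is simple enough to spell out. One bitcoin is 100,000,000 satoshis; the exchange rate below is purely hypothetical, chosen for round numbers, since bitcoin’s dollar price changes constantly.

```python
# Pricing a small dollar purchase in satoshis.
SATOSHIS_PER_BTC = 100_000_000  # 1 bitcoin = 100 million satoshis
btc_price_usd = 10_000          # HYPOTHETICAL rate, for illustration only

usd_amount = 5.00               # the $5 purchase from the text
btc_amount = usd_amount / btc_price_usd
satoshis = round(btc_amount * SATOSHIS_PER_BTC)

print(satoshis)  # 50000 satoshis at the assumed rate
```

So even at prices in the thousands of dollars per coin, the satoshi keeps everyday amounts expressible as whole units.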
The future of Bitcoin itself is not easy to predict. However, digital currency is here to stay; there are already many digital currency competitors (e.g. Ethereum, Ripple) and even governments are working on ways to use this technology for their own national currencies. For your part, you can download the Satoshi Nakamoto paper, slog your way through it, rent a garage and start your own company.
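For the curious, the public key idea mentioned in the bullets above can be illustrated with textbook RSA on toy numbers. Bitcoin itself uses elliptic-curve signatures rather than RSA, and real keys run to hundreds of digits – that size is exactly what produces the combinatorial explosion; the tiny primes here are for demonstration only.

```python
# Toy RSA: the receiver publishes (n, e) but keeps p, q and d secret.
# Undoing the public operation without d requires factoring n, which
# becomes computationally hopeless as the primes grow.
p, q = 61, 53                      # the receiver's secret primes
n = p * q                          # 3233, published in the public key
phi = (p - 1) * (q - 1)            # 3120, kept secret
e = 17                             # public exponent, coprime to phi
d = pow(e, -1, phi)                # private exponent (modular inverse)

message = 65                       # a message encoded as a number < n
ciphertext = pow(message, e, n)    # anyone can encrypt with the PUBLIC key
decrypted = pow(ciphertext, d, n)  # only the PRIVATE key undoes it
print(decrypted)  # 65 -- the original message recovered
```

The asymmetry is the whole point: encrypting is easy for everyone, while decrypting is easy only for the party who built the puzzle in the first place.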


At the very beginning of the 1600’s, explorers from England (Gosnold), Holland (Hudson and Block) and France (Champlain) nosed around Cape Cod and other places on the east coast of North America. Within a very short time, New England, New Netherland and New France were founded along with the English colony in Virginia; New Sweden soon followed. Unlike the early Spanish conquests and settlements in the Americas, which were under the aegis of the King of Spain and legitimized by a papal bull, these new settlements were undertaken by private companies – the Massachusetts Bay Colony, the Dutch West India Company, the Compagnie des Cent-Associés de la Nouvelle France, the Virginia Company, the Swedish South Company.

New Sweden was short lived and was taken over by New Netherland; New Netherland in turn was taken over by the Duke of York for the English crown.

In terms of sheer area, New France was the most impressive. To get an idea of its range, consider taking a motor trip through the U.S.A. going from French named city to French named city, a veritable Tour de France:

Presque Isle ME -> Montpelier VT -> Lake Placid NY -> Duquesne PA -> Vincennes IN -> Terre Haute IN -> Louisville KY -> Mobile AL -> New Orleans LA -> Petite Roche (now Little Rock) AR -> Laramie WY -> Coeur d’Alène ID -> Pierre SD -> St Paul MN -> Des Moines IA -> Joliet IL -> Detroit MI

That covers most of the U.S. and you have to add in Canada. It is interesting to note that even after the English takeover of New Netherland (NY, NJ, PA, DE) in 1664, the English territories on the North American mainland still basically came down to the original 13 colonies of the U.S.

The first question is how did the area known as the Louisiana Territory get carved out of New France? Mystère.

To look into this mystery, one must go back to the French and Indian War, which in Europe is known as the Seven Years War. This war, which started in 1756, was a true world war in the modern sense of the term, with fronts on five continents and with many countries involved. This was the war in which Washington and other Americans learned from the French and their Indian allies how to fight against the British army – avoid open field battles above all. This was the war that left England positioned to take control of India, and this was the war that ended New France in North America: with the Treaty of Paris of 1763, all that was left to France in the New World were Haiti and two islands in the Caribbean (Guadeloupe and Martinique) and a pair of islands off the Grand Banks (St. Pierre and Miquelon). The English took control of Canada and all of New France east of the Mississippi. Wait a second – what happened to New France west of the Mississippi? Here the French resorted to a device they would use again to determine the fate of this territory – the secret treaty.

In 1762, realizing all was lost in North America but still in control of the western part of New France, the French king, Louis XV (as in the furniture), transferred sovereignty over the Louisiana Territory to Spain in a secret pact, the Treaty of Fontainebleau. (For a map, click HERE) The British were not informed of this arrangement when signing the Treaty of Paris in 1763, apparently believing that the area would remain under French control. On the other hand, in this 1763 treaty, the Spanish ceded the Florida Territory to the British. This was imperial real estate wheeling-and-dealing on an unparalleled scale; but to the Europeans the key element of this war was the Austrian attempt to recover Silesia from the Prussians (it failed); today Silesia is part of Poland.

How did the Louisiana Territory get from being part of Spain to being part of the U.S.? Again mystère.

The Spanish period in the Louisiana Territory was marked by struggles over Native American slavery and African slavery. With the Treaty of Paris of 1783, which ended the American War of Independence, the Florida Territory, which included the southern ends of Alabama and Mississippi, was returned to Spain by the British. For relations between the U.S. and Spain, the important issue became free navigation on the Mississippi River. Claims and counterclaims were made for decades. Eventually the Americans secured the right of navigation down the Mississippi, so goods could be freely shipped on the Father of Waters on barges and river boats and the cargo could still pass through New Orleans before being moved to ships for transport to further locations. This arrangement was formalized by a treaty in 1795, known as Pinckney’s Treaty, though one the Spanish governor often honored in the breach.

The plot thickened in 1800 when France and Spain signed another secret treaty, the Third Treaty of San Ildefonso. This transferred control of the Louisiana Territory back to the French, i.e. to Napoleon.

Switching to the historical present for a paragraph, Napoleon’s goal is to re-establish New France in New Orleans and the rest of the Louisiana Territory. This ambition so frightens President Thomas Jefferson that, in a letter to Robert Livingston, the ambassador to France, he expresses the fear that the U.S. will have to seek British protection if Napoleon does in fact take over New Orleans:

“The day that France takes possession of New Orleans…we must marry ourselves to the British fleet and nation.”

This from the author of the Declaration of Independence!! So he instructs Livingston to try to purchase New Orleans and the surrounding area. This letter is dated April 18, 1802. Soon he sends James Monroe, a former ambassador to France who has just finished his term as Governor of Virginia, to work with Livingston on the negotiations.

The staging area for Napoleon’s scheme was to be Haiti. However, Haiti was the scene of a successful rebellion against French rule in the 1790’s, led by Toussaint Louverture, which brought about the abolition of slavery in Haiti and on the entire island of Hispaniola by 1801. Napoleon’s response was to send a force of 31,000 men to retake control. At first, this army managed to defeat the rebels under Louverture, to take him prisoner, and to re-establish slavery. Soon, however, the army was out-maneuvered by the skillful military tactics of the Haitians and it was decimated by yellow fever; finally, at the Battle of Vertières in 1803, the French force was defeated by an army under Jean-Jacques Dessalines, Louverture’s principal lieutenant.

With the defeat of the French in Haiti at the hands of an army of people of color, the negotiations in Paris over navigation at New Orleans turned suddenly into a deal for the whole of the Louisiana Territory – for $15 million. The Americans moved swiftly, possibly illegally and unconstitutionally, secured additional funding from the Barings Bank in London and overcame loud protests at home. The Louisiana Territory was formally ceded to the U.S. on Dec. 20, 1803.

Some numbers: the price of the Louisiana Purchase comes to less than 3 cents an acre; adjusted for inflation, this is $58 an acre today – a good investment indeed made by a man, Thomas Jefferson, whose own finances were always in disarray.

There is something here that is reminiscent of the novel Catch-22 and the machinations of the character Milo Minderbinder (Jon Voight in the movie): Barings Bank profited from this large transaction by providing funds for the Napoleonic regime at a point in time when England was once more at war with France! What makes the Barings Bank stunt more egregious is that Napoleon was planning to use the money for an invasion of England (which never did take place). But, war or no war, money was being made.

The story doesn’t quite end there. The British were not happy with these secret treaties and the American purchase of the Louisiana Territory, but they were too occupied by the Napoleonic Wars to act. However, their hand was forced with the outbreak of the War of 1812. At their most ambitious, the British war aims were to restore to Canada the part of New France in the U.S. east of the Mississippi and to gain control of the Louisiana Territory to the west of the Mississippi. To gain control of the area east of the Mississippi, forces from Canada joined with Tecumseh and other Native Americans; this strategy failed. With Napoleon’s exile to Elba, a British force was sent to attack New Orleans in December 1814 and to gain control of the Louisiana Territory. This led to the famous Battle of New Orleans, to the victory which made Andrew Jackson a national figure, and to that popular song by Johnny Horton. So this strategy failed too. It is only at this point that American sovereignty over the Louisiana Territory became unquestioned. It can be pointed out that the Treaty of Ghent ending this war had been signed before the battle; however, it was not ratified by the Senate until a full month after the battle, and who knows what a vengeful and batty George III might have done had the battle gone in his favor. It can be said that it was only with the conclusion of this war that the very existence of the U.S. and its sovereignty over vast territories were no longer threatened by European powers. The Monroe Doctrine soon followed.

The Haitians emerge as the heroes of this story. Their skill and valor forced the French to offer the entire Louisiana Territory to the Americans at a bargain price, and theirs was the second nation in the Americas to declare its independence from its European overlords – January 1, 1804. However, when Haiti declared its independence, Jefferson and the Congress refused to recognize their fellow republic and imposed a trade embargo, because they feared the Haitian example could lead to a slave revolt here. Since then, French and American interference in the nation’s political life has occurred repeatedly, rarely with benign intentions. And the treatment of Haitian immigrants in the U.S. today hardly reflects any debt of gratitude this nation might have.

The Haitian struggle against the French is the stuff of a Hollywood movie, what with heroic figures like Louverture, Dessalines and others, political intrigues, guerrilla campaigns, open-field battles, defeats and victories, and finally a new nation. Hollywood has never taken this on (although Danny Glover appears to be working on a project), but in the last decade there have been a French TV mini-series (since repackaged as a feature film) and other TV shows about this period in Haitian history.

The Barings Bank continued its financially successful ways. At one point, the Duc de Richelieu called it “the sixth great European power”; at another point, it actually helped the Americans carry out deals in Europe during the War of 1812, again proving that banks can be above laws and scruples. However, its comeuppance finally came in 1995. It was then the oldest investment bank in the City of London and banker to the Queen, but the wildly speculative trades of a star trader in the Singapore office forced the bank to fail; it was sold off to the Dutch bank ING for £1. The villain in the piece, Nick Leeson, was played by Ewan McGregor in the movie Rogue Trader.

In the end, Napoleon overplayed his hand in dealing with Spain. In 1808, he forced the abdication of the Spanish king Carlos IV and installed his own brother as “King of Spain and the Indies” in Madrid. This led to a long guerrilla war in Spain which relentlessly wore down the troops of the Grande Armée and which historians consider to have been the beginning of the end for the Little Corporal.