Henry F Graff. Presidents: A Reference History. Editor: Henry F Graff. 3rd edition. Detroit: Charles Scribner’s Sons, 2002.
The creation of the presidency was one of the grand achievements of the Constitutional Convention that met in Philadelphia in 1787. Although the controversy between the large and small states regarding their representation in Congress was the first order of business, the delegates did not delay long in taking up the subject of the executive. Without hesitation they referred to the executive they were establishing as the president. The Constitution names as president the presiding officer of the Senate, but the appellation is understood today to belong alone to the chief executive of the United States. The word presidency to describe the office of president was already current in 1800.
The designation president had long been familiar to Americans; it was used in the colonies and then in the states to denote the chief executive, or chief magistrate, as he was often called. By 1800, though, in all the states the title of president had given way in popular usage to governor. It had not seemed remarkable that from the First Continental Congress in 1774 to the last session of the Second Continental Congress in 1789, the chairman was “President of the Congress” and, under the Articles of Confederation from 1781 on, “President of the United States in Congress Assembled.” In those fifteen years, fourteen men had held the title, because the term of service was fixed at one year. Perhaps the best remembered of these original presidents is John Hancock, whose famous signature is an ornament of the Declaration of Independence. Still, all of them were mere instruments of the congresses that chose them. Although they took on administrative duties of various kinds because there was no other agency to fulfill them, their powers were not defined; as delegates to the Congress they were simply first among equals. The idea of a separation of powers between the executive and the legislature was not yet on the American political horizon.
Fashioning the New Office
At the Constitutional Convention, the first time the discussion of the executive came up, the delegates debated whether that office should be held by one person or be a plural office. One delegate argued that “unity in the executive magistracy” would be too risky. It would prove to be, he said, the “fetus of monarchy.” The specter of a tyrannical executive was well-grounded in the history of the American Revolution. Thomas Paine in his magisterial Common Sense, published in 1776, had affixed to George III the label “the Royal Brute of Britain.” The fear of a new brute was ineradicable in the public mind. So, how much power should the proposed executive have? A deeply concerned delegate unhesitatingly declared that he regarded “the executive magistracy as nothing more than an institution for carrying the will of the legislature into effect.” The legislature, he maintained, “was the depository of the supreme will of the people.” This view of a president as dominated by Congress swiftly lost favor among the delegates.
As the work of the Convention proceeded, one of the most influential delegates in shaping the executive was James Wilson of Pennsylvania, a Scotsman who had immigrated to America as a young man and had ardently supported the break with the Mother Country; in 1776 he had signed the Declaration of Independence. Wilson had no fear that a new monarchy was in the making, for he was confident, he said, that republican instincts were too well rooted in the public mind. He was convinced that there must be a single magistrate who would give “most energy, dispatch, and responsibility to the office.” Wilson also keenly favored the direct popular election of the president, another idea that the Convention was unwilling to accept. George Mason, a Virginian, was adamant: allowing the people to choose the president, he insisted, would be “as unnatural as it would be to refer a trial of colors to a blind person.”
A second shaper of the emerging presidency was James Madison of Virginia, who, holding the view that Congress could become as oppressive as George III had been, also argued for a strong executive. Such an officer would serve as a counterweight to the legislature which, experience had shown, did not shrink from exerting its power. Madison supported a single seven-year term for the president—yet another idea that was rejected.
Early in September, a Committee on Unfinished Business, chaired by David Brearley of New Jersey, added precise touches to the specifications for the new executive. The proposals were honed in a vigorous but not prolonged series of discussions. The term of office would be four years. The president would be required to be a “natural-born citizen” and at least thirty-five years of age. An electoral college—with the number of each state’s electors equaling the number of its congressional representatives plus its two senators—was devised for the election of the president, both to make election indirect and to balance the interests of the large and the small states. The electors, chosen in a manner determined by each state legislature, would vote for two persons, at least one of whom could not be an inhabitant of the elector’s own state. While this design gave the advantage to the large states, it was assumed, as one delegate said, that “nineteen times in twenty” no individual would win a majority of the votes. The decision would then devolve upon the House of Representatives, where each state, regardless of the size of its delegation, would have one vote.
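The arithmetic of this scheme can be sketched in a few lines; the state names and House-seat counts below are hypothetical, used only to illustrate how each state’s electoral weight and the majority threshold are computed.

```python
# Illustrative sketch of the Convention's electoral arithmetic.
# State names and House-seat counts are hypothetical.
states = {"A": 10, "B": 6, "C": 1}  # state -> number of representatives

# Each state's electors equal its representatives plus its two senators.
electors = {state: reps + 2 for state, reps in states.items()}

total = sum(electors.values())
# A candidate needs a majority of all electors; failing that, the
# choice devolves on the House, voting one vote per state.
majority = total // 2 + 1

print(electors)   # {'A': 12, 'B': 8, 'C': 3}
print(total)      # 23
print(majority)   # 12
```

Note how the flat two-senator bonus narrows the gap between large and small states: hypothetical state A has ten times C’s representation in the House but only four times its electoral weight.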
The electoral college has long been a contentious feature of presidential elections. The opposition to it was especially clamorous after disputed canvasses, as those of 1824 and 1876 were; after that of 1888, when Benjamin Harrison (1889-1893) won in the electoral college despite receiving 100,000 fewer popular votes than Grover Cleveland (1885-1889; 1893-1897), his chief opponent; and after that of 2000, when the Republican, George W. Bush (2001-), defeated Albert Gore although the Democrat received 540,000 more popular votes. In his inaugural address, Bush made no mention of how narrow his victory had been, but later conceded sardonically: “I wasn’t exactly a landslide winner.” Andrew Jackson (1829-1837), when he was finally in the White House after the House of Representatives had denied him the presidency in 1824 despite his plurality of both popular and electoral votes, proposed the direct election of presidents by the people in each of his Annual Messages to Congress. (He also wished to restrict presidents to one term of four or six years.)
When the Convention came to the end of its deliberations, the final phrasing of the finished document was referred to a Committee of Style that turned the work over to Gouverneur Morris, a Pennsylvania delegate and native New Yorker. Morris was the third major figure in the making of the presidency. A few years earlier he had had a large part in the writing of New York’s constitution, which provided for a strong executive. Among many delegates in Philadelphia, New York’s arrangement seemed an ideal model to follow. With respect to the presidency, Morris left standing the language already agreed upon, which is the heart of Article II of the Constitution: “The executive power shall be vested in a President of the United States of America,” with no qualifying statement whatsoever. The explanation was not far to seek. One delegate put it this way: “I do [not] believe the [executive powers] would have been so great had not many of the members cast their eyes toward General Washington as President; and shaped their Ideas of the Powers to be given a President, by their opinions of his Virtues.” The expectation that Washington, who inspired unbounded confidence in his integrity and probity, would fill the chair of president likely spurred the Convention to make the president commander-in-chief and to provide for a four-year term renewable without limit.
The Presidency at Work
Combining today the roles of head of government and chief of state, the president is the very symbol of the United States, its premier statesman, and its resounding voice in the family of nations. From the White House flow the major initiatives for domestic and foreign policies. As titular head of the party in power, the president is likewise the chief politician of the country, the designated unifier of the competing constituencies and regions. He is also incomparably the principal newsmaker. But what the world takes for granted about the stability of America’s presidential system of government was by no means assured when George Washington (1789-1797) took the oath of office on 30 April 1789. The words he spoke—repeated by presidents-elect every four years since—are a model of republican simplicity, then an unfamiliar concept in the governance of nations. His hand on the Bible, the chief executive-to-be pledges: “I do solemnly swear (or affirm) that I will faithfully execute the Office of President of the United States, and will to the best of my Ability, preserve, protect and defend the Constitution of the United States.”
For many years, people the world over wondered how the United States would fare under a president, often referring to the remarkable provision and powers of its chief leader as the “American experiment.” From time immemorial, monarchies—many of them absolute in power—had been almost universally the preferred form of government. Few people, indeed, would have guessed that the success of the American presidency would become an inspiration for peoples throughout the world.
Washington had no wish to be a king and did not act like one, but republics were rare in history and there were no guides to how a president ought to conduct himself. Washington was acutely conscious of the novelty of his position as the first of its kind in the history of the world, and he understood that almost every step he took would set a precedent for those who filled the office after him. The public also was keenly aware that the country was breaking fresh ground. Even how to address the president had to be decided. One senator suggested: “His Highness the President of the United States of America and Protector of their Liberties”; another proposed: “His Elective Highness.” In the end the simple “Mr. President” seemed august enough.
The presidency has continued to grow in complexity along paths that the founding fathers could not have imagined. Washington altered the conduct of the office as laid out in the Constitution through the creation of the cabinet, a body of advisers who also head the administrative subdivisions of the government. The cabinet was in its way the equivalent of the council of state that Madison had proposed at the Constitutional Convention for the purpose of keeping a watchful eye on the president. The cabinet that Washington established was not a monitor above him but, on the contrary, being his appointees, the members were subservient to him.
The cabinet arrangement began when Washington directed that the heads of the executive departments—state, treasury, and war—meet in his absence from the capital city. They met in this way only once, but beginning in 1793 these officials, along with the attorney general, began to meet with Washington periodically. James Madison, then a member of the House of Representatives, was the one who first referred to the group as the cabinet. Washington had in mind taking votes on public issues at the cabinet sessions in order to establish that he was not an authoritarian. But the conflict between Jefferson, the secretary of state, and Hamilton, the secretary of the treasury, was so intense that the idea died quickly.
Because the cabinet has no constitutional foundation, presidents have used it variously. It has grown in size as new federal departments have been created, and is accordingly unwieldy as a vehicle for making decisions. Dwight D. Eisenhower (1953-1961) found it useful for vetting policy proposals, but John F. Kennedy (1961-1963) regarded it as of little value. Some presidents have used it chiefly for political purposes: James K. Polk (1845-1849) filled its places with possible successors; Abraham Lincoln (1861-1865), on the other hand, filled it with four members of his party who had been his rivals for the nomination in 1860. When it was suggested to him that they would eat him up, he replied that they would eat each other up. By the end of the Civil War he was not consulting the cabinet at all, to the dismay of the members. This has increasingly been the history of presidential cabinets.
Presidents have come to rely on groups of intimates for their advisers. Andrew Jackson’s coterie of cronies, mostly journalists, were designated the “Kitchen Cabinet” because they met in the kitchen of the White House. Theodore Roosevelt’s (1901-1909) insider group of friends and agency heads was his “Tennis Cabinet” because they discussed public business on the White House tennis court. Franklin D. Roosevelt (1933-1945) had as his closest aides a “brain trust,” a designation describing the scholars, economists, and lawyers who played significant roles in shaping New Deal policies. In conducting the Vietnam War, Lyndon B. Johnson (1963-1969) brought together his principal associates for lunch on Tuesdays in the private quarters of the White House. This group has been referred to as the Tuesday Cabinet. George W. Bush relies heavily on a group of “insiders,” some of whom had their first national political experience in the administration of his father.
Even before Washington’s presidency ended, the rise of political parties had commenced to alter the workings of the office. Although Washington detested the idea of parties, two groups had arisen in response to the national problems his administration was addressing. One group, guided by Thomas Jefferson and James Madison, was called the Democratic-Republicans. They had support in rural parts of the North and South and on the frontier. The other group, which had its strength among business and manufacturing interests, designated themselves the Federalists. They were generally pro-England and the Jeffersonians pro-France; at the time, the French Revolution was convulsing Europe. Washington in 1793 issued a neutrality proclamation stating that the United States would be impartial in foreign relations, that the nation would take no sides. This position set the basic policy for presidents in their conduct of international affairs for the next century.
When Washington’s second term was coming to an end in 1796, he announced that he would not seek a third term. In tribute to Washington’s decision, the next two-term presidents, Jefferson (1801-1809) and Madison (1809-1817), both forswore a third term. Ulysses S. Grant (1869-1877) sought a third term in 1880, and Theodore Roosevelt ran again in 1912, but the tradition was not actually broken until 1940. That year Franklin D. Roosevelt sought and won a third term—and in 1944 a fourth term. Today the Twenty-second Amendment to the Constitution forbids any person from being elected president more than twice. Ratified in 1951, it was a kind of posthumous rebuke to FDR. Both Ronald Reagan (1981-1989) and William Jefferson Clinton (1993-2001) felt frustrated by the provisions of the amendment.
The early presidents had been chosen by congressional leaders of the emerging parties in party “caucuses.” At first these decisions were made in secret meetings, then for a while openly. The supporters of Jackson, representing a new spirit in politics, railed against “King Caucus” as undemocratic. After the savage presidential contest of 1824, the day of the caucus system was over, to be succeeded by a unique American institution, the national nominating convention. It was first convened in 1831 by the Anti-Masons, a small, single-issue party that is remembered now only for that contribution. Today, delegates of the major parties gather quadrennially in June or July to select their standard-bearers. The Republican delegates who gathered in Philadelphia in 2000 were 2,066 strong, with an equal number of alternates. A few weeks later when the Democrats met in Los Angeles, they assembled 4,366 men and women and 610 alternates. Both parties had chosen their delegates in state primary elections, state conventions, and state committees of the parties.
The system has grown more complicated in our own time, owing to the fact that the choosing of the next presidential candidates begins almost as soon as Inauguration Day is over. Would-be presidents announce that they are throwing their hats in the ring: they establish what are called exploratory committees, which promptly metamorphose into fund-raising organizations.
The testing time is the primary season, which commences in January in Iowa and continues in other states until June. Invariably, however, most of the primary votes are cast before April, and the nominations are clinched long before the final primaries. The number of delegates that each state party sends to its convention depends on the party’s performance in the previous election, bonuses being granted for the number of party members elected to Congress, the state legislatures, and the governorships. The delegates whom the Republicans at present send to their convention are pledged to vote for a particular candidate.
The Democrats’ delegates are divided in accordance with the way each state voted in the previous three elections. Seventy-five percent of them represent electoral districts; the rest are at-large. Party leaders and certain elected officeholders constitute an additional 15 percent of the total number. On top of this number, there are several hundred so-called superdelegates, chosen because they are high officials of the party or because of the importance of the elective office they hold. Superdelegates are unpledged. The Republicans assign all the delegates of a particular state or congressional district to the candidate who wins the most votes in that region. The Democrats assign the delegates in proportion to the votes the candidates received in the primaries.
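The contrast between the two allocation rules can be sketched with invented numbers; the candidate names and vote totals below are hypothetical, and the proportional rule is simplified (actual Democratic rules add a 15 percent threshold and largest-remainder rounding, which are omitted here).

```python
# Hypothetical primary results in one state (names and votes invented).
votes = {"Smith": 50_000, "Jones": 30_000, "Brown": 20_000}
delegates = 10  # delegates at stake in this state

# Republican-style winner-take-all: the plurality winner gets every delegate.
winner = max(votes, key=votes.get)
wta = {name: (delegates if name == winner else 0) for name in votes}

# Democratic-style proportional allocation (simplified: no 15 percent
# threshold, simple rounding instead of largest-remainder).
total = sum(votes.values())
proportional = {name: round(delegates * v / total) for name, v in votes.items()}

print(wta)           # {'Smith': 10, 'Jones': 0, 'Brown': 0}
print(proportional)  # {'Smith': 5, 'Jones': 3, 'Brown': 2}
```

With identical returns, the winner-take-all rule hands the plurality winner all ten delegates, while the proportional rule leaves the trailing candidates with nearly half of them; this is why front-runners tend to clinch Republican nominations faster.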
Some of the struggles that have taken place in conventions are legendary. The Democratic convention of 1844 produced Candidate Polk in a wild session, making him the first “dark horse” to run for the White House. Not even mentioned in the first seven ballots, he, rather than the better-known leaders of the party, was nominated on the ninth. Franklin Pierce (1853-1857) was not a candidate when the Democrats met in the summer of 1852, and his wife was firmly opposed to his becoming one. He was chosen on the thirty-fifth ballot. In 1880 James A. Garfield (1881) was not mentioned on the first ballot of the Republican convention. On the second to fifth ballots he had only one vote. His strength rose and fell on successive ballots; even on the thirty-fifth he had only fifty votes. But a sudden flood of support made him the nominee on the thirty-sixth. In 1912 Woodrow Wilson (1913-1921), the governor of New Jersey, was eventually nominated by the Democrats on the forty-sixth ballot.
The Democratic convention of 1924 was the monster of all conventions. The nominee for president, John W. Davis, was finally named on the 103rd ballot, as people with radios—then newfangled—listened intently for the result they thought would never come. Today, the convention has become something of an anachronism. Franklin D. Roosevelt in 1932 was nominated by his party on the fourth ballot. Since then, every president has been nominated by his party on the first ballot. The last major party nominee for the presidency not chosen on the first ballot was Wendell L. Willkie, named by the Republicans on the sixth in 1940.
Presidential campaigns since almost the beginning have been a source of wonder and of public entertainment. The first campaign that displayed the vitriol and half-truths that are the stuff of today’s was that of 1800, when John Adams (1797-1801) was battling for reelection against Thomas Jefferson. In the back-and-forth between their respective newspaper supporters (for only in print did the verbal fireworks take place), Adams was reviled as a fool and a tyrant and, once again as in 1796, as an “avowed friend of monarchy.” Jefferson was denounced as an atheist, a Franco-maniac, and sometimes simply as Mad Tom. He was, it was said, an infidel whose passions posed untold, indeed unimaginable, danger to the republic.
The Federalists and the Democratic-Republicans acted as if they were foreign adversaries, constantly at each other’s throat in the public prints and using denunciatory language that even now is rarely used. No citizen of that day would have predicted that in their old age Adams and Jefferson would correspond with each other on the great questions of republican government in a body of letters that is today a national treasure. Despite the heat of the political struggles, voters have always seemed able to see beyond the rhetoric of the combatants and their followers. Notice, for example, that all incumbent presidents running for office during wartime have been reelected. Although James Madison was fiercely blamed for the War of 1812, which was denounced as “Mr. Madison’s War,” he was reelected in 1812. Lincoln won a second term in 1864, even though the outcome of the Union cause was still in doubt.
Personal assaults have never been absent from presidential canvasses, but they are usually ineffective. A rumor of Jefferson’s amorous connection with his slave Sally Hemings was circulated in 1804, but the issue evaporated quickly and did not defeat its target in the election. (The story never died, however, and in the late 1990s DNA testing confirmed that the descendants of Sally Hemings had a Jefferson ancestor, possibly Thomas.) As for Andrew Jackson in 1828, he was stung by the scorpion charge, which had long shadowed him, that Rachel, his dear wife, had lived with him in adultery. She had mistakenly assumed she was divorced from her first husband at the time that she and the hero of the Battle of New Orleans (1815) were married. That she and Jackson were remarried when the facts were revealed mattered not at all to the supporters of John Quincy Adams. Rachel Jackson’s death before her husband’s inauguration left Jackson bereft, and he never ceased to blame his opponents’ scurrility for it. The story plainly did not weaken Jackson’s candidacy.
Martin Van Buren (1837-1841), when he ran for president in 1836, was said to be an illegitimate son of Aaron Burr—thus, a bastard sired by a traitor who had killed Alexander Hamilton in 1804. Lincoln in the election of 1860 felt the pain of hearing that he was an illegitimate child, too. Such personal calumniation reached new heights during the election of 1884. The Democratic candidate, Grover Cleveland, was said to be the father of a child born out of wedlock, and his opponents made the most of the story. Cleveland had not been sure he was the father of the little boy—it might have been the responsibility of a friend of his—but being a bachelor he took on the obligation of supporting the child and his mother. In the end, his determination to tell the truth, as he also urged his supporters to do, carried the day and won him the election. Nevertheless, the impertinent jingle “Ma, Ma, where’s my pa? Gone to the White House, ha ha ha” lingers in presidential history.
In the new century, possibly because Victorian moralism in America was at its zenith, coupled with big-power responsibilities and the sobering effect of World War I, Americans came to think of their presidents as high-minded, flawless men. For years salacious gossip ceased to be a part of presidential campaigns, although Wilson’s possibly adulterous trysts and the infidelity of Warren G. Harding (1921-1923) were well known to intimates and to many newspaper reporters. Franklin D. Roosevelt’s longtime affair with Lucy Mercer Rutherfurd, which had commenced in 1917 when he was assistant secretary of the navy, was not general knowledge until after his death in 1945. His relations with Margaret (“Daisy”) Suckley, a distant cousin, came to light only in the 1990s. John F. Kennedy’s promiscuous womanizing likewise played no part in his campaign of 1960.
The old restraint broke down during the 1980s, and today presidential privacy is no more. Bill Clinton’s affair with Monica Lewinsky, an intern in the White House, opened the way to his impeachment in 1998. His earlier entanglement with an Arkansas nightclub singer, Gennifer Flowers, was a featured story during his first campaign in 1992. And the lawsuit that a former Arkansas state employee, Paula Jones, brought against Clinton in 1994 for sexual misconduct (which he eventually settled) contributed heavily to the tabloidization of newspapers and other media, as well as the denigration of the presidency. Without apparent embarrassment, he and Hillary Rodham Clinton, soon to be the First Lady, readily discussed their marriage on national television during the campaign of 1992. A candidate’s life is now a more or less open book, with income tax returns and medical records and every kind of rumor and report quickly turned into a national—or even international—story.
The first presidents did not “campaign” in the way modern Americans think of a run for the White House. It was regarded as unseemly for a potential president to appear to be seeking the office. Indeed, the myth was general that the office seeks the man. Still, the ferocious jostling for political primacy is an essential ingredient of the office. Lincoln, for example, was pleased to be everybody’s second choice at the Republican convention of 1860 in Chicago. He rightly expected that none of his opponents was strong enough to carry the day and that the party would come to him. His appetite for the office was hard to hide; he had said with barely feigned zest: “The taste is in my mouth a little.” When Adlai Stevenson was approached for the Democratic nomination in 1952, his first response was, “Let this cup pass.” He was not posing as a modest man but rather shrinking from a contest against General Eisenhower, a sure winner.
The eye of the television camera combined with modern investigative reporting has made it impossible any longer to hide or cover up a candidate’s real or perceived shortcomings. Although Franklin Roosevelt suffered paralysis in his legs after contracting polio in 1921, and often had to be carried by Secret Service agents, the general public was unaware of his incapacity. Every effort was made to hide it from the public—in the belief, apparently, that FDR’s ability to govern might be deemed impaired. Yet today the public seems to have given up such taboos as would once have militated against presidential candidates. Adlai Stevenson, the Democratic nominee, who ran against General Eisenhower in 1952 and 1956, was the first candidate of a major party who was a divorced man, and Ronald Reagan was the first divorcé elected to the presidency. In neither case did the marital status of the man become an issue. But the fact that Democratic nominee Alfred E. Smith was a Roman Catholic had played a major part in his defeat in 1928. Only when John F. Kennedy, also a Roman Catholic, sought the office in 1960 did this so-called handicap no longer apply. In 2000, Democrat Al Gore chose Senator Joseph Lieberman, an Orthodox Jew, to be the nominee for vice president; Lieberman’s religion did not seem to be a deciding factor in the outcome of the election.
As the public began more and more to participate in the hoopla of naming a president, the campaigns became dramatic and picturesque—full of parades, songs, banners, and slogans. Some of the slogans are indelible markers in the nation’s history: “Tippecanoe and Tyler, too” (the Whigs in 1840 with William Henry Harrison for president); “We Polked ’em in ’44, we’ll Pierce them in ’52” (the Democrats in 1852); “McKinley and the Full Dinner Pail” (the Republicans in 1900); “He kept us out of war” (the Democrats in 1916 with Woodrow Wilson for president); “Keep Cool with Coolidge” (the Republicans in 1924); “I Like Ike” (the Republicans in 1952 with Eisenhower for president); “Nixon’s the One” (the Republicans in 1968).
Until the end of the nineteenth century the principals did not themselves take part in the noisy ballyhoo. Their surrogates and the party faithful did the dirty work. When there were issues at stake—as became commonplace after the Civil War—the candidates’ positions on them would be made known by letters published in the newspapers, or by the candidate announcing his support of the party platform. Between 1865 and 1900 controversy revolved chiefly around the tariff (should it be high or low?) and the currency (should it be backed by silver or by gold or by both?). These hotly disputed questions were really old ones much debated in antebellum days, but now they masked others that did not enter presidential campaign discourse. Deep-lying concerns, such as the conditions of those who toiled in the fields and the swelling industrial centers, the slums in which millions were forced to live, the lack of supervision of the food and drug supply, and, not least of all, the treatment of people of color, including not only the freedmen but also Native Americans and people of Asian descent—none of these found any echo in presidential campaigns.
In office, chief executives have sometimes uttered words of complaint about the burden they bear. Jefferson came to call the presidency “a splendid misery.” “A bed of thorns,” was how John Tyler put it. “Dignified slavery,” was Andrew Johnson’s phrase. Beset by office-seekers, James A. Garfield declared: “My God! What is this place that a man should ever want to get into it.” Harry Truman labeled the executive mansion a “big white jail.” Grover Cleveland was the most candid when he confessed that being president is “a self-inflicted penance,” for nobody has yet been dragged kicking and screaming into the White House. Indeed, few presidents, excepting perhaps James Buchanan (1857-1861) and Herbert Hoover (1929-1933), though fully aware that they occupy the White House only temporarily, have been able to resist bemoaning their departure on moving day.
The Vice President
John Adams of Massachusetts, Washington’s successor in 1797, had been the vice president. As such he was the first to suffer under the restrictions a vice president has generally confronted, lacking any stated duty except to preside over the Senate and to vote there only to break a tie. At the Constitutional Convention, Elbridge Gerry, who was destined to serve as vice president (1813-1814) under Madison, insisted there was no need for the office. Adams regarded himself as “a mere mechanical tool to wind up the clock.” Notwithstanding, he had seen, he wrote his wife, that he was invested with “two separate powers—the one in esse [in actuality] and the other in posse [in potential].” In actuality he was nothing; in potential he could be everything.
So it has been in the entire history of the office. Although Adams as president viewed Jefferson, his vice president, as “the first prince of the country, and the heir apparent to the sovereign authority,” Jefferson complained that no one was consulting him “as to any measure of government.” Charles G. Dawes, the vice president under Calvin Coolidge (1923-1929), told then Senator Alben W. Barkley, who later would be vice president under Harry S. Truman (1945-1953): “I can do only two things: one is to sit up here and listen to you birds talk, without the privilege of being able to answer you back. The other is to look at the newspapers every morning to see how the President’s health is!” Still, despite these derisive words, nine vice presidents, by office only men standing in the wings, have succeeded to the presidency.
Although he felt useless, Adams still holds the record for breaking tie votes (twenty-nine). By comparison, in his eight years as vice president under Ronald Reagan, George H. W. Bush (1989-1993) cast only eight such ballots, and Albert Gore, vice president from 1993 to 2001, only four. By a quirk of fate, it became Gore’s last official duty to announce from the chair in the Senate in January 2001 the electoral vote making the victor in the canvass just ended his opponent, George W. Bush.
When Jefferson and Aaron Burr were tied in the electoral college in the presidential election of 1800, the House of Representatives was called upon to choose between them. To prevent such a deadlock from recurring, the Twelfth Amendment, added to the Constitution in 1804, requires electors to mark their ballots to show separately their choices for president and vice president. The result has been the choosing of a person not as a potential successor to the presidency but in order to bolster the chances of the head of the ticket. The office soon came to be used to “balance” a presidential ticket with a politician from another part of the country. Thus Jefferson and Madison, both from Virginia, served with George Clinton of New York. Sometimes it is personalities that have been “balanced”: the sophisticated New Yorker Martin Van Buren, for instance, had as his running mate Richard Mentor Johnson of Kentucky, who boasted that he “was born in a cane brake and cradled in a sap trough.” Sometimes the choice has been used to neutralize the stance of the presidential candidate. Thus when William Jennings Bryan, ardent advocate of free silver, was nominated in 1896, the Democratic convention named Arthur Sewall of Maine, a bank president, to run with him. Sometimes the vice presidential nominee settles a political debt, as in 1932 when John Nance Garner of Texas was repaid for helping make Franklin D. Roosevelt the standard-bearer. In 1984 the Democratic nominee, Walter F. Mondale, in a bow to the rising tide of feminism, named Geraldine Ferraro, a congresswoman from New York, as his running mate. She became thereby the first woman designated by a major party for one of the two highest elective offices. General Eisenhower’s selection in 1952 of the young Richard Nixon to be his vice president owed something to Ike’s eagerness to blunt criticism that he was too old to be president.
Bill Clinton’s naming of Al Gore as his running mate in 1992 was aimed in part at projecting an image of youth in the Democratic party leadership. Presidents who have served more than one term have sometimes changed their vice president. Of the fifty-four presidential elections since the first, only eight resulted in the reelection of the entire ticket.
In the nineteenth century Martin Van Buren, elected in 1836, was the only vice president who reached the White House other than through the death of his president. Today it is assumed that the vice president is “in the succession,” and from the outset of an administration he is an adviser and consultant to the president—the only other nationally elected official. Since 1960 it has been the practice for the presidential nominees to select the other half of their ticket, rather than leave the choice to the convention, and upon election to give the vice president a medley of tasks, mostly ceremonial but increasingly substantial. For example, President Jimmy Carter (1977-1981) insisted that Vice President Walter F. Mondale attend every policy meeting that he himself attended. This arrangement took on new form under President George W. Bush (2001-), whose vice president, Dick Cheney, was given four offices in Washington to deal with the wide range of responsibilities laid upon him. These tasks made him seem a kind of prime minister, the most powerful holder of the second highest office in history. After the terrorist attacks of 11 September 2001, Cheney was reported to be performing his duties from a “secure and undisclosed location.” He and the president were hoping to ensure the continuity of government in the event that either became a victim of a terrorist attack. Cheney remained “the coach to Mr. Bush’s quarterback,” and through video conferences and frequent use of the telephone, he kept a close watch on policies.
Presidents by Accident
Remarkably, the Constitution was in force for more than half a century before a president died in office. When William Henry Harrison (1841) succumbed to pneumonia a month after his inauguration, John Tyler (1841-1845), his vice president, immediately took the presidential oath and claimed his full rights as chief executive. No member of the Constitutional Convention was still living to say whether the framers had intended otherwise in such an eventuality. Like many contemporaries, John Quincy Adams, a former president (1825-1829), was appalled. He wrote in his diary: “I paid a visit this morning to Mr. Tyler, who styles himself President and not Vice President acting as President.” Efforts in Congress to argue that Tyler was somehow not entitled to exercise the powers of the office failed to carry the day.
Other “accidental” presidents have seemed to mean surprisingly good luck for the country. The second such was Millard Fillmore (1850-1853), vice president under Zachary Taylor (1849-1850), who had not even consulted him in selecting his cabinet. After Taylor’s death in office, Fillmore helped bring about the Compromise of 1850 that forestalled for a decade the looming disruption of the Union. When Lyndon Johnson succeeded to the presidency upon the assassination of John F. Kennedy in 1963, civil rights legislation that had languished in Congress was put on the road to enactment in a notable burst of presidential energy, by a man who had been a recognized master of the Senate when he was the majority leader. Two of the outstanding presidencies in the twentieth century, Theodore Roosevelt’s and Harry S. Truman’s, followed the deaths of their principals. Yet the transition has not always been smooth. When James A. Garfield was critically wounded by a gunshot in 1881 and Vice President Chester A. Arthur met the next day with the cabinet, no one rose to greet him and they stared at him dumbly.
Several times the line of succession beyond the vice president has been fixed by legislation. A law passed in 1792 put the president pro tempore of the Senate, usually spoken of as president pro tem, next in line; a new law in 1886 removed the president pro tem from the line of succession and substituted the cabinet officers. President Truman in 1947 signed into law the Succession Act that remains in effect today. It places the Speaker of the House and then the president pro tem next after the vice president, to be followed by the secretary of state and the other cabinet officers in the order of their departments’ creation.
Finding the Candidates
The recruitment of presidents has changed over time. The first presidents were those who helped to found the nation. Washington’s role is memorialized forever as the “Father of His Country” and as “First in war, first in peace, and first in the hearts of his countrymen.” John Adams, the second chief executive, was already known to fame as the “Atlas of Independence” for his Herculean labors in the cause of liberty during the 1770s and 1780s. Then followed a group of luminous Virginians: Jefferson, the third president, was the principal author of the Declaration of Independence; Madison, Jefferson’s close friend, had earned the sobriquet “Father of the Constitution”; and James Monroe (1817-1825), who had been wounded as a soldier in the War for Independence and was said to look like George Washington, is remembered as the “last of the cocked hats”—the last president who wore the tricorne of the Revolutionary era. Jefferson, Madison, and Monroe all had served as secretary of state—Jefferson under Washington, Madison under Jefferson, and Monroe under Madison. Consequently, the office of secretary of state came to be regarded as preparation for the presidency and its holder as next in line to the White House.
This pattern became a factor in the election of 1824, when Monroe’s second term in office was coming to an end. Andrew Jackson of Tennessee won 99 electoral votes, John Quincy Adams of Massachusetts 84, William H. Crawford of Georgia 41, and Henry Clay of Kentucky 37. Because none of them had a majority, the House of Representatives had to select the winner from among the top three. Clay, though excluded from the balloting, was in the position of president-maker because he could throw his support to either Jackson or Adams. Clay found it congenial to side with Adams, and Adams was thereupon elected. Soon afterward Adams made Clay his secretary of state. The Jackson supporters were outraged, shouting “bargain and corruption.” They believed that with that appointment Adams had “paid off” Clay by putting him in succession to the presidency—in accordance with the established practice since the beginning of the office, with the single exception of John Adams’s presidency.
In his inaugural address, Adams admitted: “Less possessed of your confidence in advance than any of my predecessors, I am deeply conscious of the prospect that I shall stand more and oftener in need of your indulgence.” He got little of it: Jackson’s backers during the next four years waged a tireless campaign to avenge their man’s defeat. Many observers could see that the nature of politics was changing. The older aristocratic view that leadership is to be found in an elite class only was beginning to be supplanted by the notion that “the people” must rule. Whereas only about 20 percent of the eligible voters went to the polls in 1824, four years later the number had quadrupled.
The Jacksonians, maintaining that they were the true successors of Jefferson, called themselves Democratic-Republicans. Soon they shortened the name to Democrats. They touted Jackson as “the people’s choice” and elected him in 1828. He had once said, “I know what I am fit for. I can command a body of men in a rough way, but I am not fit to be President.” Nevertheless, on his inauguration day hordes of his partisans overran the White House in their certainty that a glorious new time for the nation was at hand. John Quincy Adams, still smarting that Jackson had not called on him to pay his respects after his election, refused to attend Jackson’s oath-taking. A few years later when Adams’s alma mater, Harvard College, bestowed an honorary doctor of laws degree on Jackson, Adams was incensed, calling his successor “a barbarian who could not write a sentence of grammar and hardly could spell his own name.” Harvard’s president might have been speaking for the emerging new outlook when he responded to Adams: “As the people have decided that this man knows laws enough to be their ruler, it is not for Harvard College to maintain that they are mistaken.”
The age of Jackson commenced a new time in the history of the presidency, for it was in the 1830s that what we call “public opinion” began to form and to become an element in national politics. “Public opinion” was not always easy to divine, but every vote-seeker knew that somehow it was the pulse of the electorate. More and more, politicians came to respond to the views of the people and to keep a wet finger to the wind.
Getting Out the Vote
In the era before the modern media changed the way the news was circulated, people would usually make their choice of president by following the decision of their local party chieftains. For the book-reading public there were the campaign biographies of the principal candidates.
This genre first appeared in 1824. In one way or another, all of the literary trumpeters made their subjects out to be “heralds of destiny”; some were also provided with exalted ancestries. In 1852 Nathaniel Hawthorne, a Bowdoin College classmate of Franklin Pierce, composed a campaign biography of his fellow-alumnus that was quickly printed in huge quantities. Five thousand copies were distributed free in New York City alone. The same year a million copies of a biography of Pierce’s opponent, General Winfield Scott, were said to have been handed out.
During the campaign of 1860 Lincoln was so little known in the eastern part of the country that newspapers sometimes simply referred to him as “the Westerner.” When he came to New York to deliver an address at the Cooper Institute (now Cooper Union), he traveled uptown to the studio of the photographer Mathew Brady, where he had his picture taken. The Republican party thereupon distributed hundreds of thousands of copies of it, making it a “first” for millions of people: seeing what their would-be president looked like. The circumstances of his life, meanwhile, were made known through widely circulated campaign biographies. Another literary lion, William Dean Howells, was the author of one of the best of them, dwelling heavily on how Lincoln had educated himself and embedding that fact in the voter’s sense of the future Emancipator’s character and worth.
Yet another literary light, General Lew Wallace, best known for his novel Ben Hur (1880), wrote Benjamin Harrison’s campaign biography. And John R. Dos Passos, a prominent lawyer and the father of the author of the acclaimed U.S.A. trilogy (1937), touted McKinley for reelection in 1900. In 1952, the popular writer John Gunther wrote a worshipful life of Dwight D. Eisenhower. When he was a candidate for the governorship of New York in 1928, Franklin Roosevelt wrote a campaign biography of Alfred E. Smith, the Democrats’ choice for president.
Before the Civil War, candidates were invariably revealed to have been paragons of virtue as children; after the 1870s Tom Sawyer provided the model boyhood: pranks and artfulness mingled with unmistakable good-heartedness. In the twentieth century candidates have also been depicted as outdoor types—Hoover, for instance, was portrayed as an avid fisherman. The Horatio Alger theme of “rags to riches” also had its place: thus Wendell Willkie, the Republican candidate in 1940, was reported to have peddled newspapers as a boy, James A. Garfield was presented as the “canal boy” working on the waterways of his native Ohio, and Eisenhower was said to have sold vegetables on the streets of Abilene, Kansas. These desirable attributes seem not to have been verified. One of Grant’s biographers affirmed that the general never drank anything “stronger than cold water.”
Boasting that the candidate was a farmer has always seemed advantageous. Franklin Roosevelt enjoyed contending that his Hyde Park homesite was really a farm. Voter-readers learned that Harry Truman had hardened his hands on the levers of a gang-plow. Garfield was depicted “working in the hay-field with his boys.”
But above all, military heroism has been the choice card of admission, for, like most peoples who applaud their war heroes, Americans have given their triumphant army leaders open sesame to the White House, from Washington to Jackson to Zachary Taylor to Theodore Roosevelt to Eisenhower. Sometimes the campaign biographer has had to find a substitute for the real thing. Thus it was recalled that William Jennings Bryan, the Democrats’ candidate in 1896, 1900, and 1908, trained a regiment in Nebraska during the Spanish-American War. Thomas E. Dewey, the Republican candidate in 1944 and 1948, was hailed as the successful chairman of the first U.S.O. fund drive, providing aid and entertainment to service men and women. And the Republicans called attention to Ronald Reagan’s World War II service, which consisted chiefly of making training films for the troops. Lord Bryce, writing in 1888 in his classic commentary The American Commonwealth, offered the judgment that what the party desires is not a good president but a good candidate. Voters must wrestle with the implications of this assessment every four years.
The campaign biography has become less important today because the media present the candidates in such intimate detail that no mystery attaches to them and no myth about them can be sustained. William Jefferson Clinton’s history of womanizing and his experiment with marijuana were the stuff of conversation well before he took the oath of office. George W. Bush’s need to give up alcohol was public knowledge and turned into a virtue when his campaign for the presidency was only beginning.
Today, biographies of the candidates issued in presidential years are sometimes searching and critical. Two of the kind in 2000 were David Maraniss and Ellen Nakashima’s The Prince of Tennessee: The Rise of Al Gore and Molly Ivins and Lou Dubose’s Shrub: The Short but Happy Political Life of George W. Bush. For many years, too, prospective presidents have forsaken the older style of seeming to be reluctant candidates and proclaimed their own merits for voter support in books usually put together hastily by ghost writers. The titles are indicative. Calvin Coolidge’s supporters, aiming to make him a dark horse candidate in 1920, brought together a collection of his speeches entitled Have Faith in Massachusetts (1919). Likewise Richard Nixon (1969-1974) put out in 1960 a compilation of his speeches called The Challenges We Face. Barry Goldwater, the Republican candidate in 1964, published Where I Stand. Four years earlier he had turned out The Conscience of a Conservative. In 1964 Lyndon Johnson proudly distributed a little book entitled My Hope for America. Jimmy Carter attracted national attention in 1975 after he released Why Not the Best? The year 1988 saw the Democratic candidate, Michael S. Dukakis, bring out Creating the Future: The Massachusetts Comeback and Its Promise for America. It was, in its way, a response to his opponent, George H. W. Bush, who had written (with Victor Gold) Looking Forward (1987). The title was identical to that of a book written by Franklin Roosevelt in 1933. The Republican candidate, Robert Dole, tried something fresh in 1996 when he made use of Jack Kemp, his popular vice presidential candidate, in Trusting the People: The Dole/Kemp Plan to Free the Economy and Create a Better America. Dole also produced a joint autobiography with his accomplished wife, Elizabeth, that they fetchingly called The Doles: Unlimited Partners (1988). In 1999, George W. Bush contributed to the species A Charge to Keep.
Aside from being of interest to historians, these books and their kind quickly gather only dust on the shelves of libraries.
Rise of the Modern Media
Changing technology also played a part in creating a new political culture. Whereas in 1801 it was considered extraordinary that Jefferson’s inaugural address had appeared in a Washington newspaper almost immediately (he had given a copy of it beforehand to the editor), William Henry Harrison’s forty years later was distributed by railroad, and people in Philadelphia could read it the evening of the day it had been delivered in the capital. When James K. Polk was nominated in Baltimore in 1844, the news was received in Washington on the first telegraph line in the country. Before Polk’s term ended, the wire service the Associated Press had been created, linking major newspapers in their coverage of events. Already the penny press, begun modestly in 1833, was placing newspapers in the hands of “the common man” and opening the way to making the populace politically informed. By the time that the telephone was in widespread use in the 1890s, news could be disseminated everywhere, and practically instantaneously, and presidential politics became one of its alluring staples. The new Linotype machine made it possible to set the stories quickly, supplying constantly updated editions of newspapers, heightening the interest in news, notably accounts of presidential activities.
In the 1890s the general use of the halftone method of reproducing pictures enabled magazine and newspaper editors to illustrate cheaply the articles they ran. This development allowed people at last to know what their president looked like. Up to then, for the most part, only woodcuts had been used, and they only occasionally. As a result, the majority of Americans only knew the faces of Washington and Lincoln. At the beginning of the twentieth century the rotogravure process improved the tonal quality of reproduced photographs, and the facial features of public figures became familiar to millions, especially in the big cities, through the Sunday supplements printed in sepia. For the first time the general public became accustomed to the looks and public doings not only of the president, but also of his family.
The first radio broadcast of election results was heard in 1920 on the Pittsburgh station KDKA. Four years later, moviegoers saw the presidential candidates for the first time in newsreels. In 1925 millions heard on the radio Calvin Coolidge’s inaugural address, another first. Franklin D. Roosevelt was the first president to be televised—at the New York World’s Fair of 1939. William Jefferson Clinton’s inauguration was the first that was broadcast live on the Internet. After World War II, television became the chief vehicle of presidential news, and the amount of money spent on presidential campaigns, mostly to pay for time on television, became a major factor and issue in national politics. Critics of the way this money was being raised argued that the sums contributed damaged the democratic process by constituting veritable bribes from interested donors.
Presidents and the Press
Presidents only gradually responded to the call and needs of the media. Their attentiveness has been in proportion to the growth of democracy, which has more and more made it de rigueur for the chief executive to seem to be in tune with the voice of the people. But from early on, the press and the presidents have had a love-hate relationship. George Washington had considered himself a unifying force for the new nation, seeking to provide a standard “to which the wise and honest may repair.” The incendiary partisan newspapers of the day irked him, because they reflected the factionalism he despised. He canceled his subscriptions to thirty of them when he left his Virginia estate at Mount Vernon for his inauguration, although he was reading one on the last day of his life. On a tour of the South in 1791 he was followed about by reporters, experiencing what every president since then has known: an unquenchable limelight. In 1798 John Adams supported the Alien and Sedition Acts designed to silence his critics. And Jefferson, who spoke feelingly of the importance of freedom of the press, nevertheless could declare: “Newspapers present for the most part only a caricature of disaffected minds.” Although Madison was the author of the First Amendment, which guarantees freedom of the press, he felt wounded by the press’s carping about his conduct of the War of 1812. During the Mexican War, Polk regarded newspaper criticism as nothing less than treason. So it has been as presidents and writers for the media wrestle with each other like scorpions in a bottle.
Lincoln owed his nomination to two editors of Chicago newspapers; yet he punished editors of Confederate sheets. His relations with the press were often stormy, and cartoonists pilloried him relentlessly. Ulysses S. Grant, seared by the revelation of corruption in his administration, felt obliged to say as he closed his second inaugural address that from the time of his first campaign in 1868 he had “been the subject of abuse and slander scarcely ever equaled in political history.”
Grover Cleveland, a secretive man, was openly hostile to the press, too. In his day newsmen did not even have working space in the White House. They were forced to stand outside in all kinds of weather and hope to buttonhole visitors as they entered or departed. When a journalist asked the president to appoint a new secretary who might be good to newspapermen, Cleveland responded: “I have a notion to appoint a man who will be good to me.” Cleveland remains the only president who refused to attend the annual dinner of the Gridiron Club, the insider association of Washington journalists founded in 1885, where the president and the press, attired in white tie and tails, “singe but do not burn” each other with more or less good-natured sallies. His successor, William McKinley (1897-1901), had the same wariness toward the press. Talking to journalists a few days after having spoken before a gathering of patent experts, he said wryly: “This is the second time that I have been called upon this week to address a congress of inventors.”
Theodore Roosevelt opened yet another new day in the history of the presidency, one in which the president is the head of his party as well as chief executive. TR had a press secretary, and one of the president’s closest friends was the Kansas newspaperman William Allen White. Roosevelt had learned early that self-promotion was an indispensable tool of the modern White House. He often talked to newspapermen while he was being shaved in the morning. Woodrow Wilson held the first press conference as it is known today—eleven days after his inauguration in 1913. Suggested by his press secretary, Joseph P. Tumulty, it was attended by about 125 newsmen. Previously only favored journalists had had access to the president. The questions were submitted in writing. Wilson himself chose when to hold these sessions and would not have yielded to a demand for them, even in the unlikely event that journalists had made such a call. Still, like his predecessors, Wilson was convinced that newspaper reports were not trustworthy.
Despite all, the public continued to rely on newspapers for judging their leaders. Horace Greeley, the editor of the New York Tribune in Lincoln’s day, was nominated by the Democrats and the Liberal Republicans in 1872 and received substantial support in the election. In 1920 the two major-party candidates were active newsmen: Warren G. Harding for the Republicans was the editor and publisher of the Marion (Ohio) Star, and James M. Cox for the Democrats, also from Ohio, was the editor and publisher of the Dayton Daily News.
Presidential suspicion of journalists persisted nonetheless: Herbert Hoover, for instance, continued to hold press conferences in what was then the usual way, taking questions submitted in writing. Under Franklin D. Roosevelt, press conferences became less formal as journalists gathered around the president, who was seated at his desk. At ease with the press (he maintained he was a journalist himself, claiming the status by virtue of having served on the Harvard Crimson as an undergraduate), he often teased his interrogators. And those whom he regarded as wrongheaded or otherwise irritating to him he consigned to his “Dunce Club.” Even so, it was still forbidden to quote the president directly. Blessed with a mellifluous speaking voice, FDR gave “fireside” chats on the radio, which was becoming an everyday appliance in millions of homes. These talks are remembered today as a hallmark of his administration, allowing him to go over the heads of the print media and giving him unique access to the public mind.
Harry Truman, considerably less effective as a public speaker than his predecessor, was certain that newspaper editorials did not reflect popular opinion. He could feel vindicated when, despite almost universal predictions by the media that his Republican opponent, Thomas E. Dewey, would win the presidency in 1948, he was elected in his own right to a term in the White House. Dwight D. Eisenhower allowed his words at press conferences to be quoted directly and tape recordings of them to be released. On 19 January 1955 his news conference was recorded on television and on movie film—a ground-breaking event. Ike began by saying: “Well, I see we are trying a new experiment this morning. I hope it doesn’t prove to be a disturbing influence.” On the fortieth anniversary of Eisenhower’s graduation from West Point, his address to the class of 1955 was telecast in color—another first for a president.
The presidential debates between competing candidates beginning in 1960, when Vice President Richard M. Nixon and Senator John F. Kennedy went toe to toe, inaugurated a now expected feature of campaigns. The electorate wants to judge not only what the opponents are espousing but also their demeanor and facial expressions. When Kennedy became president, his ready wit made some of his press conferences entertaining as well as informative, and he scheduled them for evening hours. As talk radio and talk television began to fill the airwaves, candidates and presidents took advantage of the opportunities to deliver their message.
New Uses of the Media
One of the first to use talk television was George H. W. Bush when he was vice president. With unusual skill Bill Clinton took advantage of his articulateness to go one-on-one with ordinary citizens at “town meetings.” During his first campaign in 1992, Clinton’s playing of a saxophone on a late-night television show appeared to make a favorable impression on many voters by thus dramatizing the message that he belonged to a new generation prepared to innovate. Others were aghast and insisted that such a performance was undignified—not recalling that President Truman once played the piano in the White House with Lauren Bacall, a famous actress, sitting on the lid displaying her shapely legs. In recent campaigns, candidates, especially from the major parties, have arranged to meet so-called focus groups to probe and keep in touch with public opinion. And the constant polling of the citizenry on questions large and small has raised “public opinion” to new heights of importance, even as it appears to diminish the function of the president as “leader.” Modern press secretaries have become skilled in massaging the president’s words to give them the best possible slant for public consumption. The press secretary is now known informally and critically as “the spinmeister”—the master interpreter.
Most Americans may be only dimly aware that before each press conference presidents are exposed to “dry runs” at which they and their staff collectively attempt to anticipate the issues on the minds and tongues of the media questioners. At these sessions the president and his people fashion answers they deem appropriate—including humorous sallies—hoping thereby to furnish the public with a clarifying or, from time to time, an obfuscating proposition. Where presidents seek to manipulate the media to their advantage, the media representatives strive constantly to draw “a story” out of the chief executive’s words. The growth of “24/7” news distribution—twenty-four hours a day, seven days a week—has made the thirst for breaking news unquenchable.
Presidential Ghost Writers
By the same token, presidential speeches are rarely the work of the president himself. George Washington’s celebrated Farewell Address, which was so influential during the long period of American isolation from world affairs, was mostly the work of Alexander Hamilton, the former secretary of the treasury. When James K. Polk asked Congress for a declaration of war against Mexico in 1846, his words were written by Secretary of the Navy George Bancroft, the most distinguished American historian of the time. Years later Bancroft was again the presidential amanuensis, this time for Andrew Johnson. Understandably, being able to write well is not an ordinary requirement in a chief executive. The most accomplished penman among all the presidents was Theodore Roosevelt, who earned substantial royalties for his works of history.
The last president who wrote his own speeches was Woodrow Wilson, who composed them on the typewriter he had relied on as a productive scholar in earlier years. Franklin Roosevelt leaned heavily on the poet Archibald MacLeish, the playwright Robert Sherwood, Judge Samuel I. Rosenman, and Harry Hopkins, who was often called the president’s alter ego. Hopkins wrote FDR’s third inaugural address. Still, the many drafts of some of Roosevelt’s speeches extant in the FDR Library at Hyde Park, New York, show how much the phrasing was in fact the president’s. Roosevelt himself was the author of his powerful “day of infamy” speech delivered before Congress the day after the attack on Pearl Harbor on 7 December 1941.
Speechwriters often are the men and women who have the “passion for anonymity” that Franklin Roosevelt hoped to find in his intimates. Their identity becomes known, though, and deserves to be, for they are the makers in a substantial way of the nation’s patriotic slogans and political maxims. Eisenhower, who was a superior writer himself, leaned nevertheless on several helpers including Edward Mead Earle, an historian who labored on Ike’s first book, Crusade in Europe (1948), a memoir of the conquest of Nazi Germany. As chief executive, Eisenhower relied on a team, as all recent presidents have done. His included particularly Emmet John Hughes of Life magazine, and C. D. Jackson, a former editor of Time. Eisenhower’s Farewell Address in January 1961 contained his memorable warning against the corrosive influence of the “military-industrial complex.” The text was substantially the work of Malcolm C. Moos, a political scientist and newspaper editor, and a friend of Eisenhower’s brother, Milton, then president of Johns Hopkins University. Theodore Sorensen, a Nebraska-born lawyer, was the principal author of some of President Kennedy’s best speeches including his distinguished inaugural address. Kennedy’s book, Profiles in Courage (1955), which earned him a Pulitzer Prize and was so influential in helping enlarge his reputation on the eve of his campaign for the presidency, was the product of skillful ghostwriting by Sorensen and others.
Richard Nixon was admirably served by William Safire, who later became a widely read political columnist for the New York Times. Presidents Reagan and George H. W. Bush used the talent of Peggy Noonan as their writer. Noonan was responsible most notably for Bush's promise not to raise taxes—"Read my lips!"—which, once broken, may have helped cost the president the election of 1992. George W. Bush has the services of Michael Gerson and Karen Hughes, who, like their recent predecessors, have the ability to mesh their own style of writing with the speaking cadences as well as the thoughts of their principal.
None of these facts should suggest that the president is a ventriloquist's dummy. All the recent presidents have worked over the drafts submitted to them for important speeches, so that when the finished product becomes public the president can say, in most cases, that it is his own. The public was recently surprised to learn that, despite the general impression that he delegated the task, Reagan often prepared even the first drafts of his own speeches. The president must deliver so many talks today and meet so many foreign guests, who often are accorded state dinners, that he could not possibly spend his time researching appropriate remarks, greetings, and toasts that have to come from the head table. Assistant speechwriters draft these ceremonial comments.
The country has been notably unable to make use of its former chief executives. The day a president leaves the office his executive power is gone and he becomes a has-been overnight. He reappears at the White House only at the invitation of the incumbent. The most publicized return of an ex-president followed Kennedy’s call to Eisenhower to join him at Camp David, the presidential retreat in the Catoctin Mountains of Maryland, after the failure of the ill-conceived Bay of Pigs invasion of Cuba in 1961. New in office, Kennedy was clearly hoping to have not only Eisenhower’s counsel but also the prestige of one of the most popular presidents on his side. Herbert Hoover, however, did not set foot in the Oval Office throughout the twelve years that Franklin Roosevelt, his successor, occupied it. In 1945, when Harry Truman, who had succeeded FDR, invited Hoover, who happened to be in town, back to the White House, Hoover wept with amazement and pleasure.
Jefferson in retirement became famous as the Sage of Monticello occupied with the planning and the building of the University of Virginia. James Monroe in his post-presidency spent much of his public energy in lawsuits against the federal government. He claimed real and imagined expense-money as due him for his services abroad as a diplomat. Although Congress twice voted him substantial sums, he continued to feel short-changed. After an unhappy presidency full of frustrated plans and hopes, John Quincy Adams returned to Washington as a member of the House of Representatives and served with distinction for seventeen years. He earned there the reputation of “Old Man Eloquent,” the most insistent antislavery voice in Congress. John Tyler at the request of the Virginia legislature accepted the chairmanship of the “peace convention” in Washington in 1861, seeking to find a way out of the secession crisis. When the Senate ignored the convention’s proposals, Tyler became himself a secessionist and won election to the Confederate House of Representatives.
Retired presidents, despite the variety of their pursuits, have shown that exercising power leaves an addiction that appears to be irreversible. In a now-famous quip, Harold Ickes, who served in Franklin Roosevelt's cabinet, said: "When a man has been bitten by the presidential bug, he begins to suffer from a terrible disease that is only cured by embalming fluid." President Grant, following an around-the-world tour in 1879, wanted to run again in 1880. Failing in his goal, he spent his last years writing what would prove to be the most widely sold book ever written by a president—his remarkable Memoirs, still in print today. They cover not his terms in the White House but his years on the battlefields of the Civil War. Theodore Roosevelt, who was only fifty years old when he left the presidency in 1909, had expressed the hope even as he entered the office eight years earlier that he would not be "a loose cannon on the deck" when his term of service was over. But he spent the next years working to win the presidency again. One of his opponents was William Howard Taft (1909-1913), recently his closest friend, whom he had turned into a resolute foe. Taft had said privately in the run-up to the election: "If you were to remove Roosevelt's skull now, you would find written on his brain '1912.'" The winner that year was the Democrat, Woodrow Wilson. Before long Roosevelt was harassing him, too, maintaining that Wilson was not sufficiently active in countering Germany's violation of American rights on the high seas. Soon after Wilson's reelection in 1916 and the entry of the United States into World War I, Roosevelt applied to Wilson—in vain, it proved—for a commission to lead an American army into Berlin.
Taft in his after-presidency gave a series of instructive lectures on the office at Columbia University, published as Our Chief Magistrate and His Powers (1916). Yet his heart and mind were bent on returning to the judicial bench where he had once sat. Indeed, in Taft's days in Roosevelt's cabinet, the president, speaking in the manner of a fortune-teller, had one evening teased him and Mrs. Taft: "I see a man standing before me weighing about 350 pounds. There is something hanging over his head. I cannot make out what it is. At one time it looks like the presidency—then again it looks like the chief justiceship." With unfeigned glee Mrs. Taft shouted: "Make it the presidency." "Make it the chief justiceship," responded Taft in a quiet voice. In 1921 President Harding appointed Taft to be chief justice. Taft is still the only man to have served in two of the three highest offices in the land. And drawing on the immense prestige this singular fact appeared to entitle him to, he continually irritated Harding and Harding's successor, Calvin Coolidge, with suggestions respecting vacancies to be filled on the Supreme Court.
Hoover, in his long post-presidency, rendered valuable service on two national commissions he headed to streamline the operation of the federal government. Nixon, the only president forced to resign his office, spent his last years writing not only his memoirs but also a series of works on foreign affairs. They were planned both to instruct the public and to help rehabilitate their author's reputation by highlighting what he regarded as his expertise in international politics. Carter kept a high profile as an ex-president. Having created the Carter Center in Atlanta, a nonprofit organization to promote peace and human rights, he took on various projects, including helping to encourage and supervise elections in other countries. Through the Jimmy Carter Work Project of Habitat for Humanity International, he and his wife, Rosalynn, helped construct houses in New York City slums.
In the television era, former chief executives do not disappear from the public eye, as Truman did. They may be seen every so often on talk shows and occasionally they are rounded up as a group—as, for instance, on the occasion in November 2000 of the two hundredth anniversary of the White House. With the exception of Ronald Reagan, all of the living former presidents were prominently present at the memorial service in Washington’s National Cathedral for the victims of the 11 September 2001 terrorist attacks in New York City, Washington, D.C., and the hijacked airplane that crashed in western Pennsylvania.
Past presidents are in demand as public speakers, not so much for what they have to say, but as glamorous attractions at business and charitable gatherings. Reagan was widely criticized for accepting $2 million for two appearances in Japan, soon after he left the White House in 1989. But the practice of “cashing in” is now accepted. The commander-in-chief in his post-presidency is recognized as a celebrity-in-chief. George H. W. Bush earns around $4 million annually in fees for about fifty such appearances. Even before he left office, Bill Clinton was being booked for talks for as much as $150,000 apiece.
Illness and Disability
Presidential illness since the beginning has been a factor in the life of the nation. Nevertheless, the health of presidents until relatively recently was rarely spoken of; it was regarded as nobody's business but the president's. Moreover, even after the advent of modern medicine, the unspoken proscription against speaking about the body in Victorian America played a role in keeping off limits the details of the chief executives' physical condition. Consequently, facts are known today that contemporaries were unaware of. George Washington, for example, was barely in office in 1789 when he developed a painful carbuncle on his left thigh. He may have been near death from the staphylococcal infection and high fever that accompanied it. His intense suffering, which in the end included surgery he had to endure without anesthetic, necessitated rebuilding the carriage he traveled in so that he could lie in it at full length. The following year he contracted a cold, which, it was said, then turned into influenza and pneumonia, causing many to despair of his recovery. We may only speculate on how the national history might have been different if the Father of His Country had died on either occasion: the first when the government was being launched under his hand as the indispensable man, and the second when the critical struggle over the first Bank of the United States was being played out.
When Andrew Jackson came to office in 1829 heralded as formidable, indestructible Old Hickory, he was in truth a debilitated man. He was still feeling the effects of a pistol shot long before lodged in his left shoulder, and suffering perpetually from intestinal bleeding, possibly caused by the calomel he took for his recurrent dysentery. We can never know how his frailty affected his performance as Chief Executive.
Presidents Polk and Truman, even though they served a century apart, have sometimes been compared as unexpected presidents who showed themselves to be feisty leaders forced to take the country into war. Both had been sickly children. Polk suffered as a boy from a bladder stone, eventually removed by surgery, that robbed him of a normal childhood. He was proud that his career as president and war leader proved he was no longer, as he once called himself, “the meager boy, with pallid cheeks, oppressed and worn with disease.”
Truman, too, endured a boyhood blighted by illness. At the age of eight, paralyzed by the effects of diphtheria, he had to be wheeled around in a baby carriage. Amply coddled, he became, simply stated, a sissy. Indeed, he liked to believe that he could arrange his sister's curls better than his mother could. But he was determined to be manly. Growing up, he set his heart on winning an appointment to the Military Academy at West Point, but this ambition was frustrated by his "flat eyeballs" (his own designation). In World War I he showed his mettle as an officer in an artillery unit, and this service in uniform had an abiding influence on his political life. As president he demonstrated enormous respect for military men, especially Generals Mark Clark, George C. Marshall, and Dwight D. Eisenhower.
The question of how far the president's health ought to come officially and contemporaneously to the public's knowledge did not trouble the country in earlier days. When, for instance, word leaked out in the 1880s that President Chester A. Arthur was suffering from Bright's disease, a usually fatal kidney ailment, the White House silenced public speculation by denouncing the story as malicious gossip. A notable case occurred in 1893, shortly after Grover Cleveland's second inauguration. He had begun to suffer from a lesion in his mouth that was soon diagnosed as a cancer, requiring immediate attention. Apart from the president's personal stake, the political stakes were enormous. Cleveland was a committed defender of the gold standard; if he should die, his vice president, Adlai E. Stevenson of Illinois (a grandfather of the Adlai E. Stevenson who twice ran unsuccessfully for president against Dwight D. Eisenhower in the 1950s), a fervent advocate of the free silver policy, would become president. Because taking Cleveland to a clinic or hospital would risk exposure, the work was done aboard a friend's yacht anchored in New York's East River near Bellevue Hospital. The medical staff was ordered to stay out of sight lest they be recognized by Bellevue's resident doctors. To keep the president steady during the procedure, as the boat sailed slowly up the river, his chair was lashed tight to the mast. To avoid external surgery, one of the doctors, William W. Keen, a Philadelphia man who had served in the Civil War and had studied abroad, employed a cheek retractor he had brought home from Paris in 1866. Operated on a second time a few weeks later, the president was fitted with a prosthesis that did not show on his face. The public was none the wiser until 1917—almost a quarter of a century later—when Dr. Keen, in an article in the Saturday Evening Post, finally broke the embargo on the story.
Although one in four presidents has been disabled at some time during his term of office, the disability of a president was not dealt with appropriately even after Woodrow Wilson was stricken by a massive stroke in 1919 that severely affected his gait and speech. His wife, Edith Galt Wilson, screened his mail and the list of his visitors, and is sometimes referred to, therefore, as the "first woman president." Wilson's medical history, had it been known in 1912 when he ran for the White House the first time, would have raised a flag of caution. The public was unaware that he had been suffering strokes since 1898. Many Americans will always believe that although the nation as a whole was bent on being quit of Europe after the end of World War I, the failure of the Senate to ratify the Treaty of Versailles was in some measure related to the personality changes the president had undergone.
A generation later, Franklin D. Roosevelt, aided as Wilson had been by a White House doctor willing to cover up what he knew, was an ailing man even as he presented to the world his smiling, confident face. At the end of 1943, when the Allied landings in Normandy were being planned, the president was suffering acutely from hypertension and hypertensive congestive heart disease that resisted efforts at treatment. The medical people in the president's entourage well knew that by D-Day in 1944, the president was barely able to concentrate on affairs of state. When he traveled to the Pacific to visit the American commanders and ostensibly to lay plans for the final assault on Japan later in the year, he was actually seeking surcease from the daily cares of his office.
Although there was much gossip in 1944 when FDR ran for a fourth term that he was mortally ill, the voters, ignorant of the truth, elected him handily. During the canvass, though, when the invasion of France was in its critical moments, the president at a private lunch with his running mate, Harry Truman, urged him for his own safety and for the good of the country, to avoid campaigning by airplane. “This time, we may need you,” the president told him presciently. Still, talking to the press after this portentous meeting, Truman offered traditional words of reassurance, saying that he found the president well and hearty. Not until 1970 was a full account of FDR’s medical condition made public—in an article in the Annals of Internal Medicine, a medical journal, by Dr. Howard Bruenn, the young naval aide who had been called in to treat the president.
After Eisenhower suffered a heart attack in 1955, he and Vice President Nixon came to an informal agreement that the vice president would take over the responsibilities of the presidency in a comparable emergency in the future. In the next years, first when Eisenhower underwent bowel surgery and then after he suffered a "brain spasm," there was, he would say, "a gap when I could not carry out the duties of my office." The openness of the Eisenhower administration in reporting on the various ailments the general came down with while in the White House was no example for the Kennedy administration, which followed the style of the Cleveland and Wilson administrations. President Kennedy suffered, beginning when he was thirty years old, from adrenal insufficiency, or Addison's disease, a fact confirmed by the autopsy performed after his assassination but kept secret at the behest of the Kennedy family. When Kennedy underwent back surgery in 1954, made riskier by his disease, he received the last rites of the Roman Catholic Church. Kennedy kept fit by taking regular supplements of cortisone and similar drugs in replacement of the adrenal hormone. Still, victims of the disease taking cortisone and its ilk are subject to mood swings and stomach inflammation, including ulcers. Moreover, the face is sometimes made fuller by the medicines.
In his quest for the nomination in 1960, Kennedy had declared himself "the healthiest candidate for President in the country." This was a backhanded reference to his opponent, Senator Lyndon B. Johnson, who had suffered a severe heart attack in 1955. When the truth about JFK's medical condition was finally revealed in the Journal of the American Medical Association in 1967, many people concluded that had it been known in 1960, he might well not have been nominated, let alone elected.
Johnson’s heart condition was constantly on his mind. He liked to say that he had had “the worst heart attack you could have and still live.” He confessed that every time he passed Wilson’s portrait, he trembled at the thought of himself lying helpless in the White House. He saw to it that there was defibrillation equipment on every floor, and he carried with him a copy of his electrocardiogram for emergency reference. The frenetic way in which he managed the Great Society legislation led some people to conclude that he felt instinctively he had no time to lose. Johnson and House Speaker John McCormack, who was next in line of succession, came to an agreement similar to the Eisenhower-Nixon arrangement. When President Reagan underwent surgery for colon cancer in 1985, he temporarily transferred the powers of his office to Vice President George H. W. Bush.
At last the Twenty-fifth Amendment, ratified in 1967, aimed at dealing officially with the vexing matter of presidential and vice presidential succession and disability. It provided formally for the first time that when death or resignation removes a president, the vice president becomes president. When a vice president is similarly removed, the president will choose a successor who takes office at once upon confirmation by a majority vote of both houses of Congress. And when a president writes to the president pro tem of the Senate and to the Speaker of the House that he is unable to perform the duties of his office—and until he informs them otherwise—the vice president becomes acting president. Similarly, if a majority of the cabinet (or of any other body that Congress designates) declares that the president cannot discharge his duties, the vice president becomes acting president. When the president declares that he is able to resume his office, he must so inform the president pro tem of the Senate and the Speaker of the House. If there is disagreement as to whether he is so able, Congress under specific time restraints must respond appropriately. In recent elections it has been common for candidates to issue medical reports on their physical condition, but these are not always complete.
The constant presence of the presidents in the public eye has also generated a continuing interest not only in First Ladies but also in the children of presidents. The offspring have played the role of stand-ins for the princes and princesses of monarchies and fairy tales. Indeed, Alice Roosevelt, the child of Theodore Roosevelt and Alice Lee Roosevelt, his late first wife, was often referred to in the press as Princess Alice, and her name was given to a popular shade, Alice blue. She wrote later: "I was the daughter of an enormously popular President and I looked upon the world as my oyster." Other First Daughters have not fared as well. Molly Garfield, just turned fourteen in 1881, had been in the White House only eight months when her father succumbed to an assassin's bullet. And Gerald Ford's (1974-1977) daughter Susan lamented that living there was like living in a "cross between reform school and a convent."
Presidential children are treated with deference when they are young and regarded as extensions of their parents when they are older. The son of President Kennedy and Jacqueline Bouvier Kennedy, known affectionately to the public as John-John, became world famous as a mere three-year-old in a picture published around the world that showed him saluting the casket of his martyred father. Years later, John junior's death in an airplane crash in 1999 evoked national mourning. The one presidential son who as a boy may have altered American history was Robert Todd Lincoln. Having failed the entrance examination to Harvard College in fifteen out of sixteen subjects, he was enrolled in Phillips Exeter Academy in New Hampshire in order to "bone up." Eager to visit and encourage him there, his father came east in 1860, lured by a fortuitous invitation to address an audience at Cooper Union in New York City for a fee of $200 and expenses. That address became a key factor in spreading Lincoln's fame and making possible his nomination for the presidency a few months later.
Presidents and the First Ladies have all kept keenly in mind the difficulties that their prominence creates for their sons and daughters. But for her father’s presidency, Jenna Bush’s brush with the law involving an underage drinking violation in Texas in 2001 would not have been public knowledge. Several first families have made earnest efforts to shelter their children not only from public exposure but also from the effects of the flattery and luxury that flow toward them. Chester A. Arthur notably tried to shield his daughter Nell from the world. The Kennedys hoped that the media would keep the spotlight off their daughter, Caroline. The media were respectful of the similar strong wishes of Bill and Hillary Rodham Clinton for their only child, Chelsea.
Still, the doings of children in the White House, especially in times of crisis, have added to the president's appeal by making him seem ordinary even though the public expects him also to be extraordinary. During the Great Depression, Sistie and Buzzie Dall, the children of the Roosevelts' daughter, Anna, and her husband, came to be national personages. Even during the agony of the Civil War, the Northern public was amused to hear that Tad Lincoln, only eight years old in 1861, had led a team of goats through the executive mansion. And a generation later, Theodore Roosevelt did not object when his lively children introduced a pony into the Blue Room and walked on stilts across the elegant broadloom carpeting. Amy Carter, just turned nine years old when her father became president, "slept over" occasionally in the tree house her parents had had constructed for her on a White House lawn.
No one can seriously believe that the children of presidents serve as models for the nation's young people. Still, their doings are constantly on display. And they think of themselves as a group with a special bond. Susan Ford said recently, "I know when I run into Luci [Johnson] and Lynda [Johnson] and Julie [Nixon] and we all go, 'And do you remember this?' there's only a few people who realize what it's like." In 1959 the largest gathering of the offspring of presidents took place at a "Life with Father" luncheon hosted by the Women's National Press Club in celebration of its fortieth anniversary. Afterward the Eisenhowers received the invitees at the White House. Nine of the sixteen living presidential children attended, along with grandchildren removed by as many as six "greats," dating back to John Adams. It was a nonpolitical get-together, but Helen Taft insisted that the children of the conservative presidents before World War I had had more fun than the children of the later liberal chief executives. She recalled that they slid down the state stairways on trays and played hide-and-seek all over the White House.
A few presidential sons may have added special luster to their parents' reputation by virtue of their military service. Quentin Roosevelt, the youngest son of Theodore and Edith Roosevelt, perished in an air duel over France in 1918 during World War I, the only child of a presidential couple killed in action for his country. Two other Roosevelt sons were wounded in the war, leading Roosevelt to say proudly, "Haven't I bully boys, one dead and two in the hospital." Theodore Roosevelt, Jr., landed in the first wave at Normandy in 1944 at the age of fifty-six and died of a heart attack a month later. He was posthumously awarded the Congressional Medal of Honor.
This highest military decoration had been won also by Webb C. Hayes, son of Rutherford and Lucy Hayes (1877-1881), for gallantry in the Philippines campaign during the Spanish-American War, when he infiltrated the enemy lines alone at night. In the Civil War, sons of presidents fought on both sides. Five presidential sons were active in the Confederate cause. Two of John Tyler's sons, David and John, left school as mere striplings to wear the gray uniform. The only son of Zachary and Margaret Taylor, Richard Taylor, saw heavy action as a general officer in the Confederate army, having been appointed by Jefferson Davis; he had earlier served as military aide to his father during the Mexican War and as his private secretary during his presidency. Charles Johnson, the son of Andrew and Eliza McCardle Johnson, was an assistant surgeon with the Middle Tennessee Infantry. Before the end of the war he was thrown by his horse and killed. Son Robert was a Union Army colonel who resigned after his father became president in order to serve as his secretary. Frederick Dent Grant was only eleven years old when the Civil War broke out, but he often accompanied his father into battle, and was slightly wounded during the Vicksburg campaign. He was a seasoned soldier at the time he entered the United States Military Academy at West Point in 1866. All four of Franklin and Eleanor Roosevelt's sons were in uniform during World War II, and all of them saw heavy action.
John Eisenhower, the only surviving child of Dwight and Mamie Eisenhower and also a West Pointer, was a major in combat in Korea when his father was nominated by the Republican party in 1952. John's superiors immediately forbade him to lead hazardous patrols lest he be captured and turned into a valuable hostage. One effect was to damage his chances for promotion. Subsequently, Truman sent for him to be present at his father's inauguration. As Eisenhower rode with Truman to the Capitol, keenly aware of the likely damage done to his son's career, he angrily asked the outgoing president who had ordered his son home. Truman bluntly answered that he had taken that action as commander-in-chief.
Sons of presidents have served in high places in presidential administrations after their fathers', no doubt trading on their name. Robert Lincoln served as secretary of war for Presidents Garfield and Arthur. James R. Garfield was secretary of the interior under Theodore Roosevelt. Herbert Hoover, Jr., was undersecretary of state for three years in the Eisenhower presidency. Franklin D. Roosevelt, Jr., was undersecretary of commerce in the Kennedy administration and was elected to Congress from New York three times beginning in 1949. James Roosevelt, another son, served in the House from his district in California for six terms beginning in 1954. Harry A. Garfield, the president's eldest son, was president of Williams College for twenty-five years in the early twentieth century. In the same era, John Tyler's youngest son, Lyon Gardiner Tyler, born after his father's term, was president of William and Mary College. John Eisenhower, who has written some well-regarded books of military history, was named by Nixon as ambassador to Belgium.
In general, presidential children have not cast discredit on their parents' reputation, even when they have not reflected glory on them. In a more straitlaced time than the present, the many divorces of Franklin and Eleanor Roosevelt's five children (they accumulated fourteen among them) caused no political repercussions. And Ronald Reagan's widespread popularity was not affected by the pinpricks he endured from his offspring. Maureen, a child of the president's first marriage, to Jane Wyman, raised the eyebrows of many Americans when she reported having seen Lincoln's ghost in the White House. The president's adopted son, Michael, wrote an exposé of his troubled childhood that he tellingly called On the Outside Looking In (1988). And in 1986 Patricia Ann Reagan (known professionally as Patti Davis) published a devastating roman à clef about the Reagans entitled Home Front.
Several presidential children died while their fathers were in the White House. Willie Lincoln, eleven years old, succumbed to a "bilious fever" in 1862. Calvin Coolidge, Jr., sixteen years old in 1924, fell victim to blood poisoning that resulted from a blister on his foot after playing tennis on a White House court. Patrick Bouvier Kennedy, two days old, in 1963 succumbed to hyaline membrane disease, an ailment of newborns. John Quincy Adams's son, George Washington Adams, while mentally deranged, jumped or fell from a steamer in Long Island Sound at the age of twenty-eight in 1829. Young Adams's uncle Charles, the second son of John Adams, had died a drunkard's death in 1800. With characteristic candor the second president had expressed his unspeakable anguish over the youth: "I renounce him. King David's Absalom had some ambition and some enterprise. Mine is a mere rake, buck, blood, and beast." Franklin and Jane Appleton Pierce's only surviving son, Benny, eleven years old, was killed before his parents' eyes in a railroad accident two months before his father took office. Mrs. Pierce was so grief-stricken that she did not attend the inauguration, and forever blamed her husband's political ambition for their irreparable loss.
The effect on the presidencies of which these sons were a part cannot be measured. The Lincolns were famously inconsolable, and Coolidge wrote in his autobiography that “when [young Calvin] went, the power and glory of the Presidency went with him….I don’t know why such a price was exacted for occupying the White House.” The Clevelands’ first child, Ruth, whose death of diphtheria at the age of twelve, a few years after they left the White House, crushed her parents, had been known as Baby Ruth. Her memory survives in the name of a candy bar still popular.
In pre-feminist days, presidents' daughters were sometimes judged by the kind of marriages they made. Zachary Taylor's daughter Sarah Knox Taylor was wedded to Jefferson Davis in 1835—despite her parents' impassioned objection—this before either her father or her husband had become a national figure. Eleanor Wilson in 1914 married William Gibbs McAdoo, the secretary of the treasury, a marriage that later ended in divorce. In an elaborate White House wedding in 1906, Alice Roosevelt became the bride of Nicholas Longworth, a congressman from Ohio who years later served as Speaker of the House of Representatives.
Margaret Wilson, Woodrow and Ellen Axson Wilson’s second daughter, and Margaret Truman, Harry and Bess Wallace Truman’s only child, were aspiring concert singers. In a moment of pique, Truman denounced in a well-publicized letter the hostile review on Margaret’s talent that appeared in the Washington Post. She would afterward write of the difficult position her father’s office placed her in: “If some critics condoned or over praised me because of my political position, others accused me of trading on my father’s prestige.” In later years she was the author of several books of crime fiction. Helen Taft served as hostess in the White House during her mother’s incapacity. She made her social debut there in 1910 in a pink chiffon dress, a color that came to be labeled “Helen pink.” She earned a Ph.D. from Yale and had a notable career as dean and professor of history at Bryn Mawr College, where her authoritarian style became legendary.
Only two presidential sons have returned to the White House as presidents themselves. John and Abigail Adams’s son, John Quincy, was elected early in 1825 by the House of Representatives. On that occasion his father said: “No man who ever held the office of President would congratulate a friend on obtaining it.” Still, he could not suppress his boundless pride: “The multitude of my thoughts and the intensity of my feelings are too much for a mind like mine, in its ninetieth year.” When George W. Bush, the son of George and Barbara Bush, was elected in 2000, the elder Bush’s emotions were no less intense, and he playfully referred to his son for a while as “Quincy” or simply “Q.” Father and son also joshed each other as “41” and “43,” a reference to their numbered places on the roll of the presidents. When the war against terrorism began in 2001, the elder Bush said he and the embattled commander-in-chief talked to each other regularly. “It’s not always about policy. It’s not, ‘What do you think, Dad, I should be doing?’ It is more the relationship of a very close family staying in touch.”
In 1848, John Van Buren, the second of the four sons of Martin and Hannah Van Buren, was prominently mentioned as a potential candidate for the Free Soil Party. He withdrew from consideration in favor of his father. In 1856 an earnest band of Whigs boomed John Scott Harrison, son of William Henry Harrison and Anna Symmes Harrison, for president. In dismissing the notion of his candidacy the younger Harrison said forthrightly that his backers’ efforts were “calculated too largely on the potency of a name.” He served in the House for two terms in the 1850s as a Whig. And he and his wife, Elizabeth Irwin Harrison, were the parents of President Benjamin Harrison, making him the only man to be both the son and the father of a president. In a bizarre incident in 1878, John Scott Harrison’s body was stolen by grave robbers who sold it to a medical school in Cincinnati, where his horrified son John by chance discovered it hanging from a rope.
Robert Todd Lincoln, the sole surviving son of the Lincolns, in 1884 received eight votes for president at the Republican National Convention. Four years later a group of Republican planners imagined that he could be an admirable running mate for a ticket headed by Frederick Dent Grant and that a Grant-Lincoln ticket would be a sure winner. In the 1940s and early 1950s Senator Robert A. Taft of Ohio, the son of William Howard and Helen Herron Taft, widely regarded as “Mr. Republican,” ardently sought the Republican nomination. He was overwhelmed at successive conventions by Wendell L. Willkie of Indiana in 1940, Thomas Dewey of New York in 1944 and 1948, and Dwight D. Eisenhower in 1952.
The peaceful transfer of power from one president to another on inauguration day is a phenomenon that Americans take for granted, but in truth it is a remarkable tribute to the success of republican government in the United States. When John Adams succeeded Washington on 4 March 1797, he wrote his beloved Abigail: “The sight of the sun setting…and another rising (though less splendid), was a novelty.” It has remained a novelty in many nations of the world. Nevertheless, the transition from one administration to another has not always been smooth. John Adams, still smarting over the outcome of the election of 1800, did not remain in Washington to witness the inauguration of Jefferson, his successor. And John Quincy Adams in 1829 also fled the town, unwilling to be present to see Andrew Jackson’s accession to the White House. In 1869, Andrew Johnson was angrily conducting a cabinet meeting even as his successor, General Grant, was being inaugurated.
Periodically there has also been bad blood between presidents and aspiring successors within the same party. In the presidential canvass of 1928 Calvin Coolidge gave only lukewarm support to the Republican nominee, Herbert Hoover, his secretary of commerce, because he disliked him intensely. He often referred to Hoover derisively as the “Wonder Boy,” and sometimes said of him: “That man has given me nothing but advice, and all of it bad.” Lyndon Johnson prized his so-called idea folder, a constantly growing collection of possible proposals for public initiatives. In 1968 he adamantly refused to lend it to Vice President Hubert H. Humphrey, the Democratic party nominee to succeed him, lest it be helpful to him in the campaign. Bill Clinton in his post-presidency remained highly critical of former vice president Al Gore, his successor as Democratic party nominee, believing that Gore would have won in 2000 if he had invited Clinton to go on the campaign trail for him. The two men, who began their era at the helm as seemingly boon companions, were barely on speaking terms. By the time of the presidential election of 1912, the former intimate friendship between Theodore Roosevelt and William Howard Taft, his successor in office (whose election in 1908 TR was largely responsible for), had broken apart, in Taft’s words, “like a rope of sand.”
Incoming presidents of the same party as the outgoing one invariably emphasize in their public utterances their aim to continue ongoing policies. At the same time, they try to establish their own identity quickly. When there is a party turnover, the incoming president typically asserts that sweeping changes are in the offing. Still, the comity between the president arriving and the one departing befits their inseparable place together in the line of the chief executives. Although Eisenhower and Truman were at odds at the time of Ike’s inauguration, Eisenhower discovered in his first moments in the Oval Office that Truman had left for him a collection of memoranda on vital issues he might have to deal with immediately.
This transfer of power was once conducted informally according to the taste of the incumbent. Millard Fillmore, for example, invited President-elect Franklin Pierce to join him at a lecture by the famed English writer William Makepeace Thackeray, then a “hot ticket” in Washington. Afterward, Thackeray and Washington Irving, who had come to Washington to hear him, were, in company with Pierce, guests at a dinner the Fillmores hosted at the White House.
In 1944, at the zenith of World War II, Franklin D. Roosevelt began the practice of giving national security briefings to his opponents. After the election of 1960, Dwight D. Eisenhower initiated the custom of creating a presidential liaison group to work with his successor’s team. Lyndon B. Johnson gave personal briefings to the men running to succeed him, and, under the direction of one of his staff, elaborate briefing books were prepared for every major segment of the executive branch to help orient the incoming Nixon administration. Johnson was eager that the period of transition not provide a moment of weakness in national security, believing that Soviet missiles had been moved at the time of Kennedy’s assassination.
George H. W. Bush spoke these words to President-elect Clinton after showing him around the White House following the election of 1992: “Bill, I want to tell you something. When I leave here, you’re going to have no trouble with me. The campaign is over, it was tough and I’m out of here. I will do nothing to complicate your work and I just want you to know that.” When Taft and Wilson rode to the Capitol for Wilson’s inauguration in 1913, they were on such cordial terms that Taft confided to his successor he had been able to save $100,000 out of his salary as president. And nothing showed off better the amity of a transition than the picture of Grover Cleveland in 1889 holding an umbrella over the head of Benjamin Harrison as he delivered his inaugural address in a drenching rain.
Ex-presidents have usually but not always spoken without criticism or rancor about their successors—even those who defeated them at the polls. During the campaign of 2000, ex-President Bush felt constrained to say that he would finally have to speak his mind about President Clinton if Clinton persisted in disparaging Bush’s son, George, the Republican nominee for president. When President George W. Bush was in office only six months, Jimmy Carter sharply scorched him for his conservative bent, particularly in foreign affairs: “I am disappointed in almost everything he has done.” President Clinton made it clear even before he left office in 2001 that he believed Vice President Gore had won the election of 2000 and that the Supreme Court’s decision preventing a recount in Florida and giving the victory to Bush was faulty. Speaking in Washington State he said sarcastically to cheering Democratic party partisans: “They have this unusual system [here]. They actually count all the votes.”
Before the Twentieth Amendment, which changed the day of inauguration from 4 March to 20 January, a newly inaugurated president had nine months of grace before the new Congress convened in December. The amendment brings the new Congress into office on 3 January, so the president-elect now has only about ten weeks in which to prepare to take over the White House. In 1952 Eisenhower financed his transition task force with money provided by a special Republican party committee. In 1960 Kennedy used leftover campaign funds to finance his. Recognizing the ad hoc nature of these arrangements, Kennedy appointed a Commission on Campaign Costs to examine the problem and to make recommendations. As a result, Congress passed the Presidential Transition Act of 1963, which authorized payment for office space, transportation, travel, and staff. The sum allowed was $900,000, and it has been increased from time to time.
Under the Presidential Transition Act of 2000, signed by President Clinton, the sum rose to $7,100,000: $1,830,000 for the outgoing administration, with the understanding that $305,000 would be returned to the Treasury if Vice President Gore was elected, and $4,270,000 for the incoming administration. The newcomers’ portion would be available from the day after the election to thirty days after the inauguration. For the outgoing president and vice president the money would be at their disposal for seven months, beginning one month before the inauguration. The disputed election of 2000 made immediate payment impossible: no “apparent victor,” as the law requires, had been identified. Both parties, therefore, were forced to raise private money. The law, incidentally, requires that the names of donors be made public and that no gift exceed $5,000.
Clearly the transfer of the presidency is more complicated than ever. Ideally the money appropriated enables a new administration to “hit the ground running,” but the exigencies of modern government make that impossible. Today there are 3,000 presidential positions requiring neither a civil service examination nor confirmation by the Senate. Some of them are in the Senior Executive Service, a roster of high-ranking civil servants, usually people with special qualifications. As for the remainder, in the last fifteen years between 1,100 and 1,500 of them have been filled at any one time under Schedule C, posts outside the civil-service merit system that are regarded as policy-determining or as involving a close and confidential relationship with a key official in the administration. These important and valued offices, despite a 25 percent vacancy rate, require security clearances, and the time this process consumes varies from a few weeks to several months.
The Clinton administration was not truly in place until October, 1993—about nine months after Clinton took office. The predecessor administration under George H. W. Bush had required about the same length of time. When Richard Nixon went to Washington for his inauguration in January 1969, he had in hand about seventy-five appointees despite earnest efforts at recruiting that included circularizing the likely entries in Who’s Who in America. Five months after becoming secretary of state in the administration of George W. Bush, Colin Powell felt himself handicapped by still having a mere skeleton staff in place. By January 2002, the administration had made only 920 Schedule C appointments. In recent years a steady call has been heard to reduce the size of Schedule C, but no incoming president has yet seen fit to limit this vestige of the spoils system.
The Presidents’ Papers
The presidential library system was created in 1939 when Franklin D. Roosevelt, aware of the swelling quantity of presidential documents, donated his papers and a portion of his private estate at Hyde Park, New York, to the federal government. His plan was to establish a repository open to the public for the study of the presidency. Previously, presidential papers were handled on an ad hoc basis by presidents and their heirs and thus scattered throughout the country, to be found today in the collections of various museums, libraries, and historical societies. Some, like Millard Fillmore’s and Chester Arthur’s, were deliberately destroyed. Many presidents’ papers are to be found in the Library of Congress.
Today there are ten libraries in the presidential library system. The quantity of papers, now supplemented by extensive electronic records, seems to grow with each succeeding presidency. Incredible as it may seem, Lyndon Johnson took back with him to his library in Austin, Texas, twenty-five hundred five-drawer filing cabinets. Under an act of Congress passed in 1955, the libraries are constructed by private and nonfederal means but are maintained by public funds. Under the Presidential Records Act of 1978, a president’s papers belong to the nation, and as soon as an administration ends the Archivist of the United States takes possession of them. No longer may presidents consider the papers generated in their time in the White House as their private property. So when an administration leaves office, it leaves no files behind.
Salary and Pension
In 2001, the president’s salary was raised to $400,000 a year, and it is expected that no former president will be strapped for money, a condition that has not always obtained. Jefferson was so impoverished that in 1815 he felt forced to sell his library to the government—forming the nucleus of today’s Library of Congress—in order to pay his creditors. Monroe was in such dire straits that, after his wife’s death, he moved in with his daughter in New York City. He was buried there because there was no money to send his remains back to Virginia. Not until 1858, in celebration of the centennial of his birth, was he reinterred, in Richmond, thanks to admirers. Ulysses S. Grant was forced into poverty in his last years by a colossal stock fraud that swallowed his savings. The $450,000 advance he received for his Memoirs proved to be the only way to provide for his family. It came as he was suffering from throat cancer and hoping to finish his book before he died.
Harry Truman was so poor upon his return to Missouri that he had to move into his mother-in-law’s house. He hoped for some relief through the passage of a pension bill, but, for inexplicable reasons, Sam Rayburn, the Speaker of the House, sat on the proposal year after year. When it finally became law during the Eisenhower administration, the pension amounted to $25,000—much welcomed by Truman. The only other living ex-president was Herbert Hoover, a millionaire many times over, who had never taken a salary as president. But he accepted the pension anyway, because, he said, he did not wish to embarrass his friend, Harry Truman. Their friendship transcended their differing party affiliation, as has been the case among most former presidents. They considered themselves equal members of the most exclusive club in the land. When Truman invited Hoover to attend the dedication of the Truman Library in Independence, Missouri, he inquired whether politics would stand in the way of his accepting. According to Truman, Hoover responded: “Of course not, that soldier boy in the White House [General Eisenhower] isn’t listening to either of us.”
Under the Former Presidents Act that Eisenhower signed, ex-presidents are entitled to a pension tied in amount to the salary of members of the cabinet; in 2001 it was $161,000. In addition they receive the franking privilege, that is, free mailing for all nonpolitical correspondence, as well as government-paid office space and office staff and allowances for travel. The sum now amounts to a $2.5 million annual entitlement. They also have lifetime Secret Service protection for themselves and their spouses, and for their children until they reach the age of sixteen. Widows are protected until they remarry.
POTUS
The acronym POTUS (the “O” being long, as in “toe”) is in common parlance in the White House today, used by in-the-know staffers to refer to the President of the United States. It is never uttered in addressing him face-to-face. POTUS long ago existed in the telegraph code that was a bible of the major news wires. And it is said that when President Franklin D. Roosevelt traveled on the Pennsylvania Railroad in his private car, Magellan, POTUS was the cover word employed to identify this important passenger. “POTUS to PRIME” was sometimes the heading that FDR’s aides (not the president himself) placed on his correspondence with Prime Minister Churchill in the era of World War II.
POTUS’s public emergence began when buttons on White House phones linked directly to the president were labeled thus in President Lyndon B. Johnson’s time. The word came into currency during President Jimmy Carter’s term, and it was picked up as shorthand by the Secret Service, matching SCOTUS, which was becoming the favorite acronym for the Supreme Court of the United States. Nancy Reagan inspired but may not have originated FLOTUS (rhyming with POTUS) to specify the First Lady. VPOTUS (pronounced vee-potus) to indicate the vice president was occasionally heard in the same era to refer to George Bush, then holding the office. Its use became ordinary when Al Gore was vice president in Bill Clinton’s administration. The word VEEP to describe the vice president became popular in President Harry Truman’s day as a nickname for Vice President Alben W. Barkley, being simply a contracted pronunciation of VP, the common abbreviation for vice president.
Kinds of Presidencies
The staffing of a presidency is vital in shaping its character. But each presidency is ultimately stamped by the personality and inclinations of the incumbent, modified by the fortuitous circumstances that force themselves upon his term. Kennedy said that to judge a president one has to know “what he had going for him.” George Washington cut the pattern for the early presidencies, which were aimed at making a nation by constructing a sense of national unity out of the varying sectional interests. Above all, aware of the mode of British politics in his day, he was opposed to factionalism and the development of “the demon of party spirit,” although in the end he could not prevent their rise. John Adams, too, tried to eschew politics. Nevertheless, the election of Jefferson in 1800 forced a change: Jefferson was the leader of a party, although the president’s role as his party’s leader was not accepted for another century. The role of the president as leader in foreign affairs and in military matters, foreshadowed by Washington’s handling of the Whiskey Rebellion in 1794, was firmly set by Jefferson’s purchase of the Louisiana territory in 1803 and by Madison’s unambiguous role as commander-in-chief of the army during the War of 1812.
Jackson’s election in 1828, after a long and celebrated military career, brought to the White House the first “man of the people.” As such Jackson could sway Congress and shape legislation, so that he may be regarded as the maker of the modern presidency. Even so, its character has been constantly refashioned. The emergence of the slavery issue dwarfed the presidents and the candidates for the office for a long generation. None of Jackson’s eight successors was reelected. The power of the presidency was reasserted and dramatized by Lincoln in his ultimately successful conduct of the Civil War. He drew to himself greater power than any that had ever been exercised. Although he had come into politics as a Whig who believed that the president is entitled only to limited authority, Lincoln became, in practice, a Jacksonian. His immediate successors, enveloped in domestic questions, including the racial upheaval caused by emancipation, tended to offer few programs for Congress to act upon. Rather, they saw the presidency as coordinate with Congress, not its director.
Social problems at the beginning of the twentieth century offered new opportunities for a creative presidency. The need and the man were combined in the Republican Theodore Roosevelt, a war hero and dynamo of energy, who used the penny press and the muckraking magazines to make himself a leader of popular causes, including the management of overseas “possessions” and the assault on overweening trusts. He regarded the presidency as a “bully pulpit,” an unmatched place from which to advocate a program. TR made himself the undisputed leader of the nation and his party, the source of innovation in public policies and the principal maker of political news. This conception of the presidency transcended party lines. The Democrat Woodrow Wilson, who came to the White House in 1913, was a former professor of political science who admired immensely the British parliamentary system. Still, he came to accept the Rooseveltian view of the presidency and to believe that if the president correctly interprets the public temper and sets forth appropriate programs, he is “irresistible.”
In the 1920s and early 1930s the successive presidents were relatively supine. Calvin Coolidge said that it was not the role of the president to send bills to Congress but the business of Congress to send the president bills to sign. However, the example of Jackson, Lincoln, Theodore Roosevelt, and Wilson remained in the bloodstream of the office. The terrible force of the Great Depression allowed Franklin D. Roosevelt to exceed even his distant cousin Theodore’s broadened presidency. He regarded the White House as essentially a place for moral leadership. Nevertheless, in his inaugural address he promised to seek “broad Executive power to wage war against the emergency.” When the threat of the totalitarian powers to the security of the United States came to seem an even greater emergency, his unprecedented use of the prestige of his office made the presidency more powerful than ever. The high point came in 1942 when FDR said that if Congress did not repeal a certain law, he himself would repeal it.
In the aftermath of World War II, the power of the office was potentially greater than ever. President Truman’s remarkable initiatives in foreign policy, which included the Marshall Plan, the Truman Doctrine, and participation in the Korean War, extended more tangibly than ever the leadership of the presidency beyond the borders of the United States. Subsequently, Eisenhower’s immortal name as the Liberator of Europe enabled him to be less aggressive and strident in exercising leadership during the cold war than Truman had been. Kennedy’s inheritance from Ike of an all-powerful office made possible his now-famous pledge to “every nation, whether it wishes us well or ill, that we shall pay any price, bear any burden, meet any hardship, support any friend, oppose any foe, in order to assure the survival and the success of liberty.”
This vastly enlarged conception, labeled by critics the “Imperial Presidency,” came to grief in Kennedy’s abortive invasion of the Bay of Pigs, and it suffered disgrace in the administration of Richard Nixon, tarnished by the Watergate episode and related crimes. Next, the failed war in Vietnam and the hostage crisis in Iran stained respectively the Johnson and Carter administrations and further weakened the hand and authority of the presidency abroad. Although President Reagan by his ebullient style and rhetoric could insist that it was once more “Morning in America,” the presidency had been shorn of some of its power, and substantially lessened in prestige.
A number of events further sullied the office, notably the Iran-contra affair in the 1980s and the personal acts of President Clinton that led to his impeachment and trial in 1998. But by the serendipity of history, the terrorist attacks on the United States on 11 September 2001 showed once again the vitality and power of the presidential office. President George W. Bush, in masterly fashion, drew the country together for the long twilight struggle that lay ahead. Shifting gears from domestic concerns that had almost exclusively dominated his plans, he transformed his presidency overnight, employing the explicit and implied powers of the office to meet the mortal threat to the homeland. The president of the United States had become, like some of his great predecessors, the voice of hope and inspiration for freedom-loving people everywhere in the world.