Figure 7.1 On September 14, 2001, President George W. Bush addresses the crowd at Ground Zero in New York City (left). President Joe Biden takes the oath of office in front of the U.S. Capitol on January 20, 2021 (right). (credit left: modification of "EMA - 3905 - Photograph by SFC Thomas R. Roberts taken on 09-14-2001 in New York" by SFC Thomas R. Roberts/Wikimedia Commons, Public Domain; credit right: modification of "President Joe Biden, joined by First Lady Jill Biden and their children Ashley Biden and Hunter Biden, takes the oath of office as President of the United States Wednesday, Jan. 20, 2021, during the 59th Presidential Inauguration at the U.S. Capitol in Washington, D.C. (Official White House Photo by Chuck Kennedy)" by The White House/Wikimedia Commons, Public Domain)
Introduction
The presidency is the most visible position in the U.S. government (Figure 7.1). During the Constitutional Convention of 1787, delegates accepted the need to empower a relatively strong and vigorous chief executive. But they also wanted this chief executive to be bound by checks from the other branches of the federal government as well as by the Constitution itself. Over time, the power of the presidency has grown in response to circumstances and challenges. However, to this day, a president must still work with the other branches to be most effective. Unilateral actions, in which the president acts alone on important and consequential matters, such as President Barack Obama’s strategy on the Iran nuclear deal, are bound to be controversial and suggest potentially serious problems within the federal government. Effective presidents, especially in peacetime, are those who work with the other branches through persuasion and compromise to achieve policy objectives.
What are the powers, opportunities, and limitations of the presidency? How does the chief executive lead in our contemporary political system? What guides the chief executive's actions, including unilateral actions? If it is most effective to work with others to get things done, how does the president do so? What can get in the way of this goal? This chapter answers these and other questions about the nation’s most visible leader.
7.1 The Design and Evolution of the Presidency
Learning Objectives
By the end of this section, you will be able to:
- Explain the reason for the design of the executive branch and its plausible alternatives
- Analyze the way presidents have expanded presidential power and why
- Identify the limitations on a president's power
Since its invention at the Constitutional Convention of 1787, the presidential office has gradually become more powerful, giving its occupants a far greater chance to exercise leadership at home and abroad. The role of the chief executive has changed over time, as various presidents have confronted challenges in domestic and foreign policy in times of war as well as peace, and as the power of the federal government has grown.
INVENTING THE PRESIDENCY
The Articles of Confederation made no provision for an executive branch, although they did use the term “president” to designate the presiding officer of the Confederation Congress, who also handled other administrative duties.1 The presidency was proposed early in the Constitutional Convention in Philadelphia by Virginia’s Edmund Randolph, as part of James Madison’s proposal for a federal government, which became known as the Virginia Plan. Madison offered a rather sketchy outline of the executive branch, leaving open whether what he termed the “national executive” would be an individual or a set of people. He proposed that Congress select the executive, whose powers and authority, and even length of term of service, were left largely undefined. He also proposed a “council of revision” consisting of the national executive and members of the national judiciary, which would review laws passed by the legislature and have the power of veto.2
Early deliberations produced agreement that the executive would be a single person, elected for a single term of seven years by the legislature, empowered to veto legislation, and subject to impeachment and removal by the legislature. New Jersey’s William Paterson offered an alternate model as part of his proposal, typically referred to as the small-state or New Jersey Plan. This plan called for merely amending the Articles of Confederation to allow for an executive branch made up of a committee elected by a unicameral Congress for a single term. Under this proposal, the executive committee would be particularly weak because it could be removed from power at any point if a majority of state governors so desired. Far more extreme was Alexander Hamilton’s suggestion that the executive power be entrusted to a single individual. This individual would be chosen by electors, would serve for life, and would exercise broad powers, including the ability to veto legislation, the power to negotiate treaties and grant pardons in all cases except treason, and the duty to serve as commander-in-chief of the armed forces (Figure 7.2).
Figure 7.2 Alexander Hamilton (a), who had served under General George Washington (b) during the Revolutionary War, argued for a strong executive in Federalist No. 70. Indeed, ten other Federalist Papers discuss the role of the presidency.
Debate and discussion continued throughout the summer. Delegates eventually settled upon a single executive, but they remained at a loss for how to select that person. Pennsylvania’s James Wilson, who had triumphed on the issue of a single executive, at first proposed the direct election of the president. When delegates rejected that idea, he responded with the suggestion that electors, chosen throughout the nation, should select the executive. Over time, Wilson’s idea gained ground with delegates who were uneasy at the idea of an election by the legislature, which presented the opportunity for intrigue and corruption. The idea of a shorter term of service combined with eligibility for reelection also became more attractive to delegates. The framers of the Constitution struggled to find the proper balance between giving the president the power to perform the job on one hand and opening the way for a president to abuse power and act like a monarch on the other.
By early September, the Electoral College had emerged as the way to select a president for four years who was eligible for reelection. This process is discussed more fully in the chapter on elections. Today, the Electoral College consists of a body of 538 people called electors, each representing one of the fifty states or the District of Columbia, who formally cast votes for the election of the president and vice president (Figure 7.3). In forty-eight states and the District of Columbia, the candidate who wins the popular vote in November receives all the state’s electoral votes. In two states, Nebraska and Maine, the electoral votes are divided: The candidate who wins the popular vote in the state gets two electoral votes, but the winner of each congressional district also receives an electoral vote.
Figure 7.3 This map shows the distribution by state of electoral votes available in the 2024 national election. The number of Electoral College votes granted to each state equals the total number of representatives and senators that state has in the U.S. Congress or, in the case of Washington, DC, as many electors as it would have if it were a state. The number of representatives may fluctuate based on state population, which is determined every ten years by the U.S. Census.
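As a quick check of the arithmetic implied by this allocation rule (holding the House at its current 435 seats and the Senate at 100):

$$
435\ \text{(House)} + 100\ \text{(Senate)} + 3\ \text{(District of Columbia)} = 538,
\qquad
\left\lfloor \tfrac{538}{2} \right\rfloor + 1 = 270,
$$

which is why 270 electoral votes is the majority needed to win the presidency, a figure that appears again later in this chapter.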
In the original design implemented for the first four presidential elections (1788–89, 1792, 1796, and 1800), the electors cast two ballots (but only one could go to a candidate from the elector’s state), and the person who received a majority won the election. The second-place finisher became vice president. Should no candidate receive a majority of the votes cast, the House of Representatives would select the president, with each state casting a single vote, while the Senate chose the vice president.
While George Washington was elected president twice with this approach, the design resulted in controversy in both the 1796 and 1800 elections. In 1796, John Adams won the presidency, while his opponent and political rival Thomas Jefferson was elected vice president. In 1800, Thomas Jefferson and his running mate Aaron Burr finished tied in the Electoral College. Jefferson was elected president in the House of Representatives on the thirty-sixth ballot. These controversies led to the proposal and ratification of the Twelfth Amendment, which couples a particular presidential candidate with that candidate’s running mate in a unified ticket.3
For the last two centuries or so, the Twelfth Amendment has worked fairly well. But this doesn’t mean the arrangement is foolproof. For example, the amendment created a separate ballot for the vice president but left the rules for electors largely intact. One of those rules states that the two votes the electors cast cannot both be for “an inhabitant of the same state with themselves.”4 This rule means that an elector from, say, Louisiana, could not cast votes for a presidential candidate and a vice presidential candidate who were both from Louisiana; that elector could vote for only one of these people. The intent of the rule was to encourage electors from powerful states to look for a more diverse pool of candidates. But what would happen in a close election where the members of the winning ticket were both from the same state?
The nation almost found out in 2000. In the presidential election of that year, the Republican ticket won the election by a very narrow electoral margin. To win the presidency or vice presidency, a candidate must get 270 electoral votes (a majority). George W. Bush and Dick Cheney won by the skin of their teeth with just 271. Both, however, were living in Texas. This should have meant that Texas’s 32 electoral votes could have gone to only one or the other. Cheney anticipated this problem and had earlier registered to vote in Wyoming, where he was originally from and where he had served as a representative years earlier.5 It’s hard to imagine that the 2000 presidential election could have been even more complicated than it was, but thanks to that seemingly innocuous rule in Article II of the Constitution, that was a real possibility.
Despite provisions for the election of a vice president (to serve in case of the president’s death, resignation, or removal through the impeachment process), and apart from the suggestion that the vice president should be responsible for presiding over the Senate, the framers left the vice president’s role undeveloped. As a result, the influence of the vice presidency has varied dramatically, depending on how much of a role the vice president is given by the president. Some vice presidents, such as Dan Quayle under President George H. W. Bush, serve a mostly ceremonial function, while others, like Dick Cheney under President George W. Bush, become a partner in governance and rival the White House chief of staff in terms of influence.
Read about James Madison’s evolving views of the presidency and the Electoral College.
In addition to describing the process of election for the presidency and vice presidency, the delegates to the Constitutional Convention also outlined who was eligible for election and how Congress might remove the president. Article II of the Constitution lays out the agreed-upon requirements—the chief executive must be at least thirty-five years old and a “natural born” citizen of the United States (or a citizen at the time of the Constitution’s adoption) who has been an inhabitant of the United States for at least fourteen years.6 While Article II also states that the term of office is four years and does not expressly limit the number of times a person might be elected president, after Franklin D. Roosevelt was elected four times (from 1932 to 1944), the Twenty-Second Amendment was proposed and ratified, limiting the presidency to two four-year terms.
An important means of ensuring that no president could become tyrannical was to build into the Constitution a clear process for removing the chief executive—impeachment. Impeachment is the act of charging a government official with serious wrongdoing; the Constitution calls this wrongdoing high crimes and misdemeanors. The method the framers designed required two steps and both chambers of the Congress. First, the House of Representatives could impeach the president by a simple majority vote. In the second step, the Senate could remove the president from office by a two-thirds majority, with the chief justice of the Supreme Court presiding over the trial. Upon conviction and removal of the president, if that occurred, the vice president would become president.
Four presidents have faced impeachment proceedings in the House; none has been both impeached by the House and removed by the Senate. In the wake of the Civil War, President Andrew Johnson was impeached after repeatedly clashing with Congress over decisions made during Reconstruction. President Richard Nixon faced an overwhelming likelihood of impeachment in the House for his cover-up of key information relating to the 1972 break-in at the Democratic National Committee headquarters in the Watergate office complex. Nixon likely would have also been removed by the Senate, since there was strong bipartisan consensus for his impeachment and removal. Instead, he resigned before the House and Senate could exercise their constitutional prerogatives.
The 1990s saw the impeachment of President Bill Clinton, brought on by his lying about an extramarital affair with a White House intern named Monica Lewinsky. Voting fell largely along party lines. House Republicans felt the affair and Clinton’s initial public denial of it rose to a level of wrongdoing worthy of impeachment. House Democrats believed it fell short of an impeachable offense and that a simple censure made better sense. Clinton was acquitted in the Senate, where the votes to convict fell well short of the two-thirds needed for removal. The same outcome had occurred in the case of Andrew Johnson in the nineteenth century, though Johnson came far closer to the threshold of votes needed for removal than Clinton did.
The most recent impeachments were of President Donald Trump, who was impeached in the House twice. However, support for removal in the Senate did not meet the super-majority requirement either time, although on the second attempt in 2021 a solid majority favored removal. The first Trump impeachment brought charges of “abuse of power” and “obstruction of Congress” related to allegations that he improperly used his office to seek help from Ukrainian officials to facilitate his re-election. The second Trump impeachment was for “incitement of insurrection” related to the attack on the U.S. Capitol building during the counting of Electoral College votes on January 6, 2021. This second impeachment drew Republican support in both chambers: ten House Republicans, including Representative Liz Cheney (R-WY), one of the party’s House leaders, voted to impeach, and seven Republican senators, including Senator Mitt Romney (R-UT), voted for removal.7 Federal investigations of the insurrection continue, and an attempt to launch an independent commission to investigate the event (similar to the 9/11 commission) passed the House but was blocked by Republicans in the Senate.8
Looking across the span of U.S. history, impeachment of a president remains a rare event indeed, and removal by the Senate has never occurred. However, with three of the five presidential impeachment proceedings having taken place in the last twenty-five years, and with two of the five most recent presidents having faced impeachment, it will be interesting to watch whether the trend continues in our partisan era. The fact that a president could be impeached and removed is an important reminder of the role of the executive in the broader system of shared powers.
The Constitution that emerged from the deliberations in Philadelphia treated the powers of the presidency in concise fashion. The president was to be commander-in-chief of the armed forces of the United States, negotiate treaties with the advice and consent of the Senate, and receive representatives of foreign nations (Figure 7.4). Charged to “take care that the laws be faithfully executed,” the president was given broad power to pardon those convicted of federal offenses, except for officials removed through the impeachment process.9 The chief executive would present to Congress information about the state of the union; call Congress into session when needed; veto legislation if necessary, although a two-thirds supermajority in both houses of Congress could override that veto; and make recommendations for legislation and policy as well as call on the heads of various departments to make reports and offer opinions.
Figure 7.4 During visits from foreign heads of state, the president of the United States is often surrounded by representatives of the military, a symbol of the president's dual role as head of state and commander-in-chief. Here, President Barack Obama delivers remarks during a welcoming ceremony for Angela Merkel, chancellor of the Federal Republic of Germany. (credit: Stephen Hassay)
Finally, the president’s job included nominating federal judges, including Supreme Court justices, as well as other federal officials, and making appointments to fill military and diplomatic posts. The number of judicial appointments and nominations of other federal officials is great. In recent decades, two-term presidents have nominated well over three hundred federal judges while in office.10 Moreover, new presidents nominate close to five hundred top officials to their Executive Office of the President, key agencies (such as the Department of Justice), and regulatory commissions (such as the Federal Reserve Board), whose appointments require Senate majority approval.11
THE EVOLVING EXECUTIVE BRANCH
No sooner had the presidency been established than the occupants of the office, starting with George Washington, began acting in ways that expanded both its formal and informal powers. For example, Washington established a cabinet or group of advisors to help him administer his duties, consisting of the most senior appointed officers of the executive branch. Today, the heads of the fifteen executive departments serve as the president’s advisers.12 And, in 1793, when it became important for the United States to take a stand in the evolving European conflicts between France and other European powers, especially Great Britain, Washington issued a neutrality proclamation that extended his rights as diplomat-in-chief far more broadly than had at first been conceived.
Later presidents built on the foundation of these powers. Some waged undeclared wars, as John Adams did against the French in the Quasi-War (1798–1800). Others agreed to negotiate for significant territorial gains, as Thomas Jefferson did when he oversaw the purchase of Louisiana from France. Concerned that he might be violating the powers of the office, Jefferson rationalized that his not facing impeachment charges constituted Congress’s tacit approval of his actions. James Monroe used his annual message in 1823 to declare that the United States would consider it an intolerable act of aggression for European powers to intervene in the affairs of the nations of the Western Hemisphere. Later dubbed the Monroe Doctrine, this declaration of principles laid the foundation for the growth of American power in the twentieth century. Andrew Jackson employed the veto as a measure of policy to block legislative initiatives with which he did not agree, a move that changed the way vetoes would be used in the future, and he acted unilaterally when it came to depositing federal funds in several local banks around the country instead of in the Bank of the United States. Jackson’s twelve vetoes were more than those of all prior presidents combined, and he issued them due to policy disagreements (their basis today) rather than as a legal tool to protect against encroachments by Congress on the president’s powers.
Of the many ways in which the chief executive’s power grew over the first several decades, the most significant was the expansion of presidential war powers. While Washington, Adams, and Jefferson led the way in waging undeclared wars, it was President James K. Polk who truly set the stage for the broad growth of this authority. In 1846, as the United States and Mexico were bickering over the messy issue of where Texas’s southern border lay, Polk purposely raised anxieties and ruffled feathers through his envoy in Mexico. He then responded to the newly heightened state of affairs by sending U.S. troops to the Rio Grande, the border Texan expansionists claimed for Texas. Mexico sent troops in response, and the Mexican-American War began soon afterward.13
Abraham Lincoln, a member of Congress at the time, was critical of Polk’s actions. Later, however, as president himself, Lincoln used presidential war powers and the concepts of military necessity and national security to undermine the Confederate effort to seek independence for the Southern states. In suspending the privilege of the writ of habeas corpus, Lincoln blurred the boundaries between acceptable dissent and unacceptable disloyalty. He also famously acted unilaterally in issuing the Emancipation Proclamation, which cited military necessity in declaring millions of enslaved people in Confederate-controlled territory to be free. His successor, Andrew Johnson, became so embroiled with Radical Republicans about ways to implement Reconstruction policies and programs after the Civil War that the House of Representatives impeached him, although the legislators in the Senate were unable to successfully remove him from office.14
Over the course of the twentieth century, presidents expanded and elaborated upon these powers. The rather vague wording in Article II, which says that the “executive power shall be vested” in the president, has been subject to broad and sweeping interpretation in order to justify actions beyond those specifically enumerated in the document.15 As the federal bureaucracy expanded, so too did presidential control over agencies like the Secret Service and the Federal Bureau of Investigation. Presidents also further developed the concept of executive privilege, the right to withhold information from Congress, the judiciary, or the public. This right, not enumerated in the Constitution, was first asserted by George Washington to curtail inquiry into the actions of the executive branch.16 The more general defense of its use offered by White House officials and attorneys is that it ensures the president can secure candid advice from advisors and staff members.
Increasingly over time, presidents have made more use of their unilateral powers, including executive orders, rules that bypass Congress but still have the force of law if the courts do not overturn them. More recently, presidents have offered their own interpretation of legislation as they sign it via signing statements (discussed later in this chapter) directed to the bureaucratic entity charged with implementation. In the realm of foreign policy, Congress permitted the widespread use of executive agreements to formalize international relations, so long as important matters still came through the Senate in the form of treaties.17 Recent presidents have continued to rely upon an ever more expansive definition of war powers to act unilaterally at home and abroad. Finally, presidents, often with Congress's blessing through the formal delegation of authority, have taken the lead in framing budgets, negotiating budget compromises, and at times impounding funds in an effort to prevail in matters of policy.
The Budget and Accounting Act of 1921
Developing a budget in the nineteenth century was a chaotic mess. Unlike the case today, in which the budgeting process is centrally controlled, Congresses in the nineteenth century developed a budget in a piecemeal process. Federal agencies independently submitted budget requests to Congress, and these requests were then considered through the congressional committee process. Because the government was relatively small in the first few decades of the republic, this approach was sufficient. However, as the size and complexity of the U.S. economy grew over the course of the nineteenth century, the traditional congressional budgeting process was unable to keep up.18
Things finally came to a head following World War I, when federal spending and debt skyrocketed. Reformers proposed the solution of putting the executive branch in charge of developing a budget that could be scrutinized, amended, and approved by Congress. However, President Woodrow Wilson, owing to a provision tacked onto the bill regarding presidential appointments, vetoed the legislation that would have transformed the budgeting process in this way. His successor, Warren Harding, felt differently and signed the Budget and Accounting Act of 1921. The act gave the president first-mover advantage in the budget process via the first “executive budget.” It also created the first-ever budget staff at the disposal of a president, at the time called the Bureau of the Budget but decades later renamed the Office of Management and Budget (Figure 7.5). With this act, Congress willingly delegated significant authority to the executive and made the president the chief budget agenda setter.
Figure 7.5 In December 1936, the House Appropriations Committee hears Secretary of Treasury Henry Morgenthau, Jr. (bottom, left) and Acting Director of the Budget Daniel Bell (top, right) on the federal finances. (credit: modification of work by the Library of Congress)
The Budget Act of 1921 effectively shifted some congressional powers to the president. Why might Congress have felt it important to centralize the budgeting process in the executive branch? What advantages could the executive branch have over the legislative branch in this regard?
The growth of presidential power is also attributable to the growth of the United States and the power of the national government. As the nation has grown and developed, so has the office. Whereas most important decisions were once made at the state and local levels, the increasing complexity and size of the domestic economy have led people in the United States to look to the federal government more often for solutions. At the same time, the rising profile of the United States on the international stage has meant that the president is a far more important figure as leader of the nation, as diplomat-in-chief, and as commander-in-chief. Finally, with the rise of electronic mass media, a president who once depended on newspapers and official documents to distribute information beyond an immediate audience can now bring that message directly to the people via radio, television, and social media. Major events and crises, such as the Great Depression, two world wars, the Cold War, and the war on terrorism, have further contributed to presidential stature.
7.2 The Presidential Election Process
Learning Objectives
By the end of this section, you will be able to:
- Describe changes over time in the way the president and vice president are selected
- Identify the stages in the modern presidential selection process
- Assess the advantages and disadvantages of the Electoral College
The process of electing a president every four years has evolved over time. This evolution has resulted from attempts to correct the cumbersome procedures first offered by the framers of the Constitution and as a result of political parties’ rising power to act as gatekeepers to the presidency. Over the last several decades, the manner by which parties have chosen candidates has trended away from congressional caucuses and conventions and towards a drawn-out series of state contests, called primaries and caucuses, which begin in the winter prior to the November general election.
SELECTING THE CANDIDATE: THE PARTY PROCESS
The framers of the Constitution made no provision in the document for the establishment of political parties. Indeed, parties were not necessary to select the first president, since George Washington ran unopposed. Following the first election of Washington, the political party system gained steam and power in the electoral process, creating separate nomination and general election stages. Early on, the power to nominate presidential candidates shifted from party operatives in the various state legislatures to what was known as the king caucus or congressional caucus. The caucus, or large-scale gathering, was made up of legislators in the Congress who met informally to decide on nominees from their respective parties. In somewhat of a countervailing trend in the general election stage of the process, by the presidential election of 1824, many states were using popular elections to choose their electors. This became important in that election when Andrew Jackson won the popular vote and the largest number of electors, but, because no candidate held a majority in the Electoral College, the House of Representatives awarded the presidency to John Quincy Adams instead. Out of the frustration of Jackson’s supporters emerged a powerful two-party system that took control of the selection process.19
In the decades that followed, party organizations, party leaders, and workers met in national conventions to choose their nominees, sometimes after long struggles that took place over multiple ballots. In this way, the political parties kept a tight control on the selection of a candidate. In the early twentieth century, however, some states began to hold primaries, elections in which candidates vied for the support of state delegations to the party’s nominating convention. Over the course of the century, the primaries gradually became a far more important part of the process, though the party leadership still controlled the route to nomination through the convention system. This has changed in recent decades, and now a majority of the delegates are chosen through primary elections, and the party conventions themselves are little more than a widely publicized rubber-stamping event.
The rise of the presidential primary and caucus system as the main means by which presidential candidates are selected has had a number of anticipated and unanticipated consequences. For one, the campaign season has grown longer and more costly. In 1960, John F. Kennedy declared his intention to run for the presidency just eleven months before the general election. Compare this to Hillary Clinton, who announced her intention to run nearly two years before the 2008 general election. Today’s long campaign seasons are filled with a seemingly ever-increasing number of debates among contenders for the nomination. In 2016, when the number of candidates for the Republican nomination became large and unwieldy, the debates were split into two tiers, with only the candidates polling greater support allowed into the more important prime-time debate. The runners-up spoke in the other debate. In 2020, it was the Democratic Party that had a large field that required staggered debates, before the field narrowed and ultimately led to the nomination of former vice president Joe Biden, who would go on to choose fellow campaigner Kamala Harris as his running mate.
Finally, the process of going straight to the people through primaries and caucuses has created some opportunities for party outsiders to rise. Neither Ronald Reagan nor Bill Clinton was especially popular with the party leadership of the Republicans or the Democrats (respectively) at the outset. The outsider phenomenon has been most clearly demonstrated, however, in the 2016 presidential nominating process, as candidates distrusted by the party establishment, such as Senator Ted Cruz and Donald Trump, who had never before held political office, raced ahead of party favorites like Jeb Bush early in the primary process (Figure 7.6).
Figure 7.6 Senator Ted Cruz (R-TX), though disliked by the party establishment, was able to rise to the top in the Iowa caucuses in 2016 because of his ability to reach the conservative base of the party. Ultimately, Cruz bowed out of the race when Donald Trump effectively clinched the Republican nomination in Indiana in early May 2016. (credit: Michael Vadon)
The rise of the primary system during the Progressive Era came at the cost of party regulars’ control of the process of candidate selection. Some party primaries even allow registered independents or members of the opposite party to vote. Even so, the process tends to attract the party faithful at the expense of independent voters, who often hold the key to victory in the fall contest. Thus, candidates who want to succeed in the primary contests seek to align themselves with committed partisans, who are often at the ideological extreme. Those who survive the primaries in this way have to moderate their image as they enter the general election if they hope to succeed among the rest of the party adherents and the uncommitted.
Primaries offer tests of candidates’ popular appeal, while state caucuses testify to their ability to mobilize and organize grassroots support among committed followers. Primaries also reward candidates in different ways, with some giving the winner all the state’s convention delegates, while others distribute delegates proportionately according to the distribution of voter support. Finally, the order in which the primary elections and caucus selections are held shapes the overall race.20 Currently, the Iowa caucuses and the New Hampshire primary occur first. These early contests tend to shrink the field as candidates who perform poorly leave the race. At other times in the campaign process, some states will maximize their impact on the race by holding their primaries on the same day that other states do. The media has dubbed these critical groupings “Super Tuesdays,” “Super Saturdays,” and so on. They tend to occur later in the nominating process as parties try to force the voters to coalesce around a single nominee.
The rise of the primary has also displaced the convention itself as the place where party regulars choose their standard bearer. Once true contests in which party leaders fought it out to elect a candidate, by the 1970s, party conventions more often than not simply served to rubber-stamp the choice of the primaries. By the 1980s, the convention drama was gone, replaced by a long, televised commercial designed to extol the party’s greatness (Figure 7.7). Without the drama and uncertainty, major news outlets have steadily curtailed their coverage of the conventions, convinced that few people are interested. The 2016 elections seem to confirm that it is now the primary process, rather than party insiders, that produces the nominee. Outsiders Donald Trump on the Republican side and Senator Bernie Sanders on the Democratic side had much success despite significant concerns about them from party elites. Whether this pattern could be reversed in the case of a closely contested selection process remains to be seen.
Figure 7.7 Traditional party conventions, like the Republican national convention in 1964 pictured here, could be contentious meetings at which the delegates made real decisions about who would run. These days, party conventions are little more than long promotional events. (credit: the Library of Congress)
ELECTING THE PRESIDENT: THE GENERAL ELECTION
Early presidential elections, conducted along the lines of the original process outlined in the Constitution, proved unsatisfactory. So long as George Washington was a candidate, his election was a foregone conclusion. But it took some manipulation of the votes of electors to ensure that the second-place winner (and thus the vice president) did not receive the same number of votes. When Washington declined to run again after two terms, matters worsened. In 1796, political rivals John Adams and Thomas Jefferson were elected president and vice president, respectively. Yet the two men failed to work well together during Adams’s administration, much of which Jefferson spent at his Virginia residence at Monticello. As noted earlier in this chapter, the shortcomings of the system became painfully evident in 1800, when Jefferson and his running mate Aaron Burr finished tied, thus leaving it to the House of Representatives to elect Jefferson.21
The Twelfth Amendment, ratified in 1804, provided for the separate election of president and vice president as well as setting out ways to choose a winner if no one received a majority of the electoral votes. Only once since the passage of the Twelfth Amendment, during the election of 1824, has the House selected the president under these rules, and only once, in 1836, has the Senate chosen the vice president. In several elections, such as those of 1876 and 1888, a candidate who received fewer popular votes than an opponent has nonetheless claimed the presidency; in 1876, the losing candidate even secured an outright majority of the popular vote. A recent case was the 2000 election, in which Democratic nominee Al Gore won the popular vote, while Republican nominee George W. Bush won the Electoral College vote and hence the presidency. The 2016 election brought another such irregularity as Donald Trump comfortably won the Electoral College by narrowly winning the popular vote in several states, while Hillary Clinton collected nearly 2.9 million more votes nationwide.
Not everyone is satisfied with how the Electoral College fundamentally shapes the election, especially in cases such as those noted above, when a candidate with a minority of the popular vote claims victory over a candidate who drew more popular support. Yet movements for electoral reform, including proposals for a straightforward nationwide direct election by popular vote, have gained little traction.
Supporters of the current system defend it as a manifestation of federalism, arguing that it also guards against the chaos inherent in a multiparty environment by encouraging the current two-party system. They point out that under a system of direct election, candidates would focus their efforts on more populous regions and ignore others.22 Critics, on the other hand, charge that the current system negates the one-person, one-vote basis of U.S. elections, subverts majority rule, works against political participation in states deemed safe for one party, and might lead to chaos should an elector desert a candidate, thus thwarting the popular will. Despite all this, the system remains in place. It appears that many people are more comfortable with the problems of a flawed system than with the uncertainty of change.23
Electoral College Reform
Following the 2000 presidential election, when then-governor George W. Bush won by a single electoral vote and with over half a million fewer individual votes than his challenger, astonished voters called for Electoral College reform. Years later, however, nothing of any significance had been done. The absence of reform in the wake of such a problematic election is a testament to the staying power of the Electoral College. The 2016 election results were even more disparate. While in 2000, Al Gore won a narrow victory in the popular vote with Bush prevailing by one vote in the Electoral College, in 2016, Clinton won the popular vote by a margin of almost 3 million votes, while Trump won the Electoral College comfortably. In 2020, the results aligned, with Joe Biden winning the popular vote and Electoral College by comfortable margins, although several battleground states were very close.
Those who insist that the Electoral College should be reformed argue that its potential benefits pale in comparison to the way the Electoral College depresses voter turnout and fails to represent the popular will. In addition to favoring small states, since individual votes there count more than in larger states due to the mathematics involved in the distribution of electors, the Electoral College results in a significant number of “safe” states that receive no real electioneering, such that nearly 75 percent of the country is ignored in the general election.
One potential solution to the problems with the Electoral College is to scrap it altogether and replace it with the popular vote. The popular vote would be the aggregated totals of the votes in the fifty states and the District of Columbia, as certified by the head election official of each state. A second solution often mentioned is to make the Electoral College proportional. That is, as each state assigns its electoral votes, it would do so based on the popular vote percentage in the state, rather than with the winner-take-all approach almost all the states use today.
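The difference between the winner-take-all rule most states use today and the proportional alternative described above can be made concrete with a short sketch. The code below is purely illustrative: the state, its ten electoral votes, and the vote totals are invented, and the largest-remainder rounding rule is just one of several ways a proportional system could handle fractional electors.

```python
# Illustrative sketch: how a state's electoral votes might be assigned under
# the current winner-take-all rule versus a hypothetical proportional rule.
# All names and vote totals below are invented for the example.

def winner_take_all(electoral_votes, popular_votes):
    """Give every electoral vote to the statewide popular-vote leader."""
    leader = max(popular_votes, key=popular_votes.get)
    return {leader: electoral_votes}

def proportional(electoral_votes, popular_votes):
    """Split electoral votes according to each candidate's vote share.

    Fractions are settled with a simple largest-remainder rule; actual
    proportional proposals differ on this detail, so treat it as one
    possible choice rather than a standard.
    """
    total = sum(popular_votes.values())
    quotas = {c: electoral_votes * v / total for c, v in popular_votes.items()}
    awarded = {c: int(q) for c, q in quotas.items()}  # whole electors first
    leftover = electoral_votes - sum(awarded.values())
    # Remaining electors go to the candidates with the largest fractional parts.
    for c in sorted(quotas, key=lambda c: quotas[c] - awarded[c], reverse=True)[:leftover]:
        awarded[c] += 1
    return awarded

# A hypothetical state with 10 electoral votes and a 56/44 popular-vote split.
returns = {"Candidate A": 560_000, "Candidate B": 440_000}
print(winner_take_all(10, returns))  # {'Candidate A': 10}
print(proportional(10, returns))     # {'Candidate A': 6, 'Candidate B': 4}
```

In this hypothetical state, the winner-take-all rule hands all ten electoral votes to the 56 percent candidate, while the proportional rule splits the same returns six to four.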
A third alternative for Electoral College reform has been proposed by an organization called National Popular Vote. The National Popular Vote movement is an interstate compact among the states that agree to join it. Once a combination of states constituting 270 Electoral College votes supports the movement, each state in the compact pledges all of its Electoral College votes to the national popular vote winner. This reform does not technically change the Electoral College structure, but it results in a mandated process that makes the Electoral College reflect the popular vote. Thus far, fifteen states and the District of Columbia, with a total of 196 electoral votes among them, have signed onto the compact.
In what ways does the current Electoral College system protect the representative power of small states and less densely populated regions? Why might it be important to preserve these protections?
Follow-up activity: View the National Popular Vote website to learn more about their position. Consider reaching out to them to learn more, offer your support, or even to argue against their proposal.
See how the Electoral College and the idea of swing states fundamentally shape elections by experimenting with the interactive Electoral College map at 270 to Win.
The general election usually features a series of debates between the presidential contenders as well as a debate among vice presidential candidates. Because the stakes are high, quite a bit of money and resources are expended on all sides. Attempts to rein in the mounting costs of modern general-election campaigns have proven ineffective. Nor has public funding helped to solve the problem. Indeed, starting with Barack Obama’s 2008 decision to forfeit public funding so as to skirt the spending limitations imposed, candidates now regularly opt to raise more money rather than to take public funding.24 In addition, political action committees (PACs), supposedly focused on issues rather than specific candidates, seek to influence the outcome of the race by supporting or opposing a candidate according to the PAC’s own interests. But after all the spending and debating is done, those who have not already voted by other means set out on the first Tuesday following the first Monday in November to cast their votes. Several weeks later, the electoral votes are counted and the president is formally elected (Figure 7.8).
Figure 7.8 The process of becoming president has become an increasingly longer one, but the underlying steps remain largely the same. (credit: modification of work by the U. S. General Services Administration, Federal Citizen Information Center, Ifrah Syed)
7.3 Organizing to Govern
Learning Objectives
By the end of this section, you will be able to:
- Explain how incoming and outgoing presidents peacefully transfer power
- Describe how new presidents fill positions in the executive branch
- Discuss how incoming presidents use their early popularity to advance larger policy solutions
It is one thing to win an election; it is quite another to govern, as many frustrated presidents have discovered. Critical to a president’s success in office is the ability to make a deft transition from the previous administration, including naming a cabinet and filling other offices. The new chief executive must also fashion an agenda, which they will often preview in general terms in an inaugural address. Presidents usually embark upon their presidency benefitting from their own and the nation’s renewed hope and optimism, although often unrealistic expectations set the stage for subsequent disappointment.
TRANSITION AND APPOINTMENTS
In the immediate aftermath of the election, the incoming and outgoing administrations work together to help facilitate the transfer of power. While the General Services Administration oversees the logistics of the process, such as office assignments, information technology, and the assignment of keys, prudent candidates typically prepare for a possible victory by appointing members of a transition team during the lead-up to the general election. The success of the team’s actions becomes apparent on inauguration day, when the transition of power takes place in what is often a seamless fashion, with people vacating their offices (and the White House) for their successors.
Read about presidential transitions as well as explore other topics related to the transfer of power at the White House Transition Project website.
Among the president-elect’s more important tasks is the selection of a cabinet. George Washington’s cabinet was made up of only four people, the attorney general and the secretaries of the Departments of War, State, and the Treasury. Currently, however, there are fifteen members of the cabinet, including the Secretaries of Labor, Agriculture, Education, and others (Figure 7.9). The most important members—the heads of the Departments of Defense, Justice, State, and the Treasury (echoing Washington’s original cabinet)—receive the most attention from the president, the Congress, and the media. These four departments have been referred to as the inner cabinet, while the others are called the outer cabinet. When selecting a cabinet, presidents consider ability, expertise, influence, and reputation. More recently, presidents have also tried to balance political and demographic representation (gender, race, religion, and other considerations) to produce a cabinet that is capable as well as descriptively representative, meaning that those in the cabinet look like the U.S. population (see the chapter on bureaucracy and the term “representative bureaucracy”). A recent president who explicitly stated this as his goal was Bill Clinton, who talked about an “E.G.G. strategy” for senior-level appointments, where the E stands for ethnicity, G for gender, and the second G for geography.
Figure 7.9 President Joe Biden and Vice President Kamala Harris pose with the Presidential Cabinet on April 1, 2021, in the Grand Foyer of the White House. Seated directly behind the president and vice president are (from left to right) Treasury Secretary Janet Yellen, Secretary of State Antony Blinken, and Defense Secretary Lloyd Austin. In addition to being historically diverse, Biden's Cabinet has more government experience than those of his predecessors, with more than 95 percent of its members having prior government experience. (credit: “Cabinet of President Joe Biden in April 2021” by Adam Schultz, The White House/Wikimedia Commons, Public Domain)
Once the new president has been inaugurated and can officially nominate people to fill cabinet positions, the Senate confirms or rejects these nominations. At times, though rarely, cabinet nominations have failed to be confirmed or have even been withdrawn because of questions raised about the past behavior of the nominee.25 Prominent examples of such failures were Senator John Tower for defense secretary (George H. W. Bush) and Zoe Baird for attorney general (Bill Clinton): Senator Tower’s indiscretions involving alcohol and womanizing led to concerns about his fitness to head the military and his rejection by the Senate,26 whereas Zoe Baird faced controversy and withdrew her nomination when it was revealed, through what the press dubbed “Nannygate,” that she had employed undocumented workers on her household staff. These two cases are emblematic of a change in how presidential nominations fail in the Senate. Failures used to involve outright rejections in committee votes or floor votes, like the Tower case. More recently, failures typically die of inattention. However, these cases are rare exceptions to the rule, which is to give approval to the nominees that the president wishes to have in the cabinet. Other possible candidates for cabinet posts may decline to be considered for a number of reasons, from the reduction in pay that can accompany entrance into public life to unwillingness to be subjected to the vetting process that accompanies a nomination.
Also subject to Senate approval are a number of non-cabinet subordinate administrators in the various departments of the executive branch, as well as the administrative heads of several agencies and commissions. These include the heads of the Internal Revenue Service, the Central Intelligence Agency, the Office of Management and Budget, the Federal Reserve, the Social Security Administration, the Environmental Protection Agency, the National Labor Relations Board, and the Equal Employment Opportunity Commission. The Office of Management and Budget (OMB) is the president’s own budget department. In addition to preparing the executive budget proposal and overseeing budgetary implementation during the federal fiscal year, the OMB oversees the actions of the executive bureaucracy.
Not all the non-cabinet positions are open at the beginning of an administration, but presidents move quickly to install their preferred choices in most roles when given the opportunity. Finally, new presidents usually take the opportunity to nominate new ambassadors, whose appointments are subject to Senate confirmation. New presidents make thousands of new appointments in their first two years in office. All the senior cabinet agency positions and nominees for all positions in the Executive Office of the President are made as presidents enter office or when positions become vacant during their presidency. Federal judges serve for life. Therefore, vacancies for the federal courts and the U.S. Supreme Court occur gradually as judges retire.
Throughout much of the history of the republic, the Senate has closely guarded its constitutional duty to consent to the president’s nominees, although in the end it nearly always confirms them. Still, the Senate does occasionally hold up a nominee. Benjamin Fishbourn, President George Washington’s nominee for a minor naval post, was rejected largely because he had insulted a particular senator.27 Other nominations that failed included Clement Haynsworth and G. Harrold Carswell, nominated for the U.S. Supreme Court by President Nixon; Theodore Sorensen, nominated by President Carter for director of the Central Intelligence Agency; and John Tower, discussed earlier. At other times, the Senate has used its power to rigorously scrutinize the president’s nominees (Figure 7.10). Supreme Court nominee Clarence Thomas, who faced numerous sexual harassment charges from former employees, was forced to sit through repeated questioning of his character and past behavior during Senate hearings, something he referred to as “a high-tech lynching for uppity Blacks.”28
Figure 7.10 In 2013, President Barack Obama nominated former Republican senator Chuck Hagel to run the Department of Defense. The president hoped that by nominating a former senator from the opposition he could ensure the confirmation process would go smoothly. Instead, however, Senator Ted Cruz used the confirmation hearing to question the Vietnam War hero’s patriotism. Hagel was eventually confirmed by a 58–41 vote. (credit: Leon E. Panetta)
More recently, the Senate has attempted a new strategy, refusing to hold hearings at all, a strategy of defeat that scholars have referred to as “malign neglect.”29 Despite the fact that one-third of U.S. presidents have appointed a Supreme Court justice in an election year, when Associate Justice Antonin Scalia died unexpectedly in early 2016, Senate majority leader Mitch McConnell declared that the Senate would not hold hearings on a nominee until after the upcoming presidential election.30 McConnell remained adamant even after President Barack Obama, saying he was acting in fulfillment of his constitutional duty, nominated Merrick Garland, longtime chief judge of the federal Circuit Court of Appeals for the DC Circuit. Garland was highly respected by senators from both parties and had won confirmation to his DC circuit position by a 76–23 vote in the Senate. When Republican Donald Trump was elected president in the fall, this strategy appeared to pay off. The Judiciary Committee advanced and the Republican-controlled Senate confirmed Trump's nominee, Neil Gorsuch, in April 2017, after exercising the so-called "nuclear option," which allowed Republicans to break the Democrats' filibuster of the nomination by a simple majority vote. Ultimately, Senator McConnell reversed his "proximity to the next election" explanation for waiting to fill a Supreme Court vacancy when Justice Ruth Bader Ginsburg passed away just prior to the 2020 election and McConnell and the Republicans quickly processed and confirmed Justice Amy Coney Barrett.
Other presidential selections are not subject to Senate approval, including the president’s personal staff (whose most important member is the White House chief of staff) and various advisers (most notably the national security adviser). The Executive Office of the President, created by Franklin D. Roosevelt (FDR), contains a number of advisory bodies, including the Council of Economic Advisers, the National Security Council, the OMB, and the Office of the Vice President. Presidents also choose political advisers, speechwriters, and a press secretary to manage the politics and the message of the administration. In recent years, the president’s staff has become identified by the name of the place where many of its members work: the West Wing of the White House. These people serve at the pleasure of the president, and often the president reshuffles or reforms the staff during the term. Just as government bureaucracy has expanded over the centuries, so has the White House staff, which under Abraham Lincoln numbered a handful of private secretaries and a few minor functionaries. A recent report pegged the number of employees working within the White House at over 450.31 When the staff in nearby executive buildings of the Executive Office of the President are added in, that number increases four-fold.
No Fun at Recess: Dueling Loopholes and the Limits of Presidential Appointments
When Supreme Court justice Antonin Scalia died unexpectedly in early 2016, many in Washington braced for a political sandstorm of obstruction and accusations. Such was the record of Supreme Court nominations during the Obama administration and, indeed, for the last few decades. Nor is this phenomenon restricted to nominations for the highest court in the land. The Senate has been known to occasionally block or slow appointments not because the quality of the nominee was in question but rather as a general protest against the policies of the president and/or as part of the increasing partisan bickering that occurs when the presidency is controlled by one political party and the Senate by the other. This occurred, for example, when the Senate initially refused to confirm anyone to head the Consumer Financial Protection Bureau, established in 2011, because Republicans disliked the existence of the bureau itself.
Such political holdups, however, tend to be the exception rather than the rule. For example, historically, nominees to the presidential cabinet are rarely rejected. And each Congress oversees the approval of around four thousand civilian and sixty-five thousand military appointments from the executive branch.32 The overwhelming majority of these are confirmed in a routine and systematic fashion, and only rarely do holdups occur. But when they do, the Constitution allows for a small presidential loophole called the recess appointment. The relevant part of Article II, Section 2, of the Constitution reads:
“The President shall have Power to fill up all Vacancies that may happen during the Recess of the Senate, by granting Commissions which shall expire at the End of their next Session.”
The purpose of the provision was to give the president the power to temporarily fill vacancies during times when the Senate was not in session and could not act. But presidents have typically used this loophole to get around a Senate that’s inclined to obstruct. Presidents Bill Clinton and George W. Bush made 139 and 171 recess appointments, respectively. President Obama made far fewer recess appointments, with a total of only thirty-two during his presidency.33 One reason this number is so low is another loophole the Senate began using at the end of George W. Bush’s presidency, the pro forma session.
A pro forma session is a short meeting held with the understanding that no work will be done. These sessions have the effect of keeping the Senate officially in session while functionally in recess. In 2012, President Obama decided to ignore the pro forma session and make four recess appointments anyway. The Republicans in the Senate were furious and contested the appointments. Eventually, the Supreme Court had the final say in a 2014 decision that declared unequivocally that “the Senate is in session when it says it is.”34 For now at least, the court’s ruling means that the president’s loophole and the Senate’s loophole cancel each other out. It seems they’ve found the middle ground whether they like it or not.
What might have been the legitimate original purpose of the recess appointment loophole? Do you believe the Senate is unfairly obstructing by effectively ending recesses altogether so as to prevent the president from making appointments without its approval?
The most visible, though arguably the least powerful, member of a president’s cabinet is the vice president. Throughout most of the nineteenth and into the twentieth century, the vast majority of vice presidents took very little action in the office unless fate intervened. Few presidents consulted with their running mates. Indeed, until the twentieth century, many presidents had little to do with the naming of their running mate at the nominating convention. The office was seen as a form of political exile, and that motivated Republicans to name Theodore Roosevelt as William McKinley’s running mate in 1900. The strategy was to get the ambitious politician out of the way while still taking advantage of his popularity. This scheme backfired, however, when McKinley was assassinated and Roosevelt became president (Figure 7.11).
Figure 7.11 In September 1901, President William McKinley’s assassination, shown here in a sketch by T. Dart Walker (a), made forty-two-year-old vice president Theodore Roosevelt (b) the youngest person to ever assume the office of U.S. president.
Vice presidents were often sent on minor missions or used as mouthpieces for the administration, sometimes with a sharp edge. Richard Nixon’s vice president Spiro Agnew is an example. But in the 1970s, starting with Jimmy Carter, presidents made a far more conscious effort to make their vice presidents part of the governing team, placing them in charge of increasingly important issues. Sometimes, as in the case of Bill Clinton and Al Gore, the partnership appeared to be smooth if not always harmonious. In the case of George W. Bush and his very experienced vice president Dick Cheney, observers speculated about whether the vice president might have exercised too much influence. Barack Obama’s choice for a running mate and subsequent two-term vice president, former Senator Joseph Biden, was picked for his experience, especially in foreign policy. President Obama relied on Vice President Biden for advice throughout his tenure. President Trump relied on Vice President Mike Pence to lead initiatives on health care reform and COVID-19, and Pence would gather West Wing officials and Cabinet members together when Trump was occupied with other matters, a practice atypical for a vice president. President Joe Biden involves Vice President Kamala Harris in every important policy discussion and has charged her with leading discussions on border control matters (Figure 7.12). In any case, the vice presidency is no longer quite as weak as it once was, and a capable vice president can do much to augment the president’s capacity to govern across issues if the president so desires.35
Figure 7.12 Vice President Kamala Harris speaks to State Department employees in Washington, DC on February 4, 2021. Vice President Harris continues a strong trend of vice presidents doing important and substantive work alongside the president. (credit: "Vice President Harris Delivers Remarks to State Department Employees" by U.S. Department of State/Wikimedia Commons, Public Domain)
FORGING AN AGENDA
Having secured election, the incoming president must soon decide how to deliver on what was promised during the campaign. The chief executive must set priorities, choose what to emphasize, and formulate strategies to get the job done. He or she labors under the shadow of a measure of presidential effectiveness known as the first hundred days in office, a concept popularized during Franklin Roosevelt’s first term in the 1930s. While one hundred days is possibly too short a time for any president to boast of any real accomplishments, most presidents do recognize that they must address their major initiatives during their first two years in office. This is the time when the president is most powerful and is given the benefit of the doubt by the public and the media (aptly called the honeymoon period), especially if entering the White House with a politically aligned Congress, as Barack Obama did. However, recent history suggests that even one-party control of Congress and the presidency does not ensure efficient policymaking. This difficulty is due as much to divisions within the governing party as to obstructionist tactics skillfully practiced by the minority party in Congress. Democratic president Jimmy Carter’s battles with a Congress controlled by Democratic majorities provide a good case in point.
The incoming president must deal to some extent with the outgoing president’s last budget proposal. While some modifications can be made, it is more difficult to pursue new initiatives immediately. Most presidents are well advised to prioritize what they want to achieve during the first year in office and not lose control of their agenda. At times, however, unanticipated events can determine policy, as happened in 2001 when nineteen hijackers perpetrated the worst terrorist attack in U.S. history and transformed U.S. foreign and domestic policy in dramatic ways.
Moreover, presidents must be sensitive to what some scholars have termed “political time,” meaning the circumstances under which they assume power. Sometimes, the nation is prepared for drastic proposals to solve deep and pressing problems that cry out for immediate solutions, as was the case following the 1932 election of FDR at the height of the Great Depression. Most times, however, the country is far less inclined to accept revolutionary change. Being an effective president means recognizing the difference.36
The first act undertaken by the new president—the delivery of an inaugural address—can do much to set the tone for what is intended to follow. While such an address may be an exercise in rhetorical inspiration, it also allows the president to set forth priorities within the overarching vision of what they intend to do. Abraham Lincoln used his first inaugural address to calm rising concerns in the South that he would act to overturn slavery. Unfortunately, this attempt at appeasement fell on deaf ears, and the country descended into civil war. Franklin Roosevelt used his first inaugural address to boldly proclaim that the country need not fear the change that would deliver it from the grip of the Great Depression, and he set to work immediately enlarging the federal government to that end. John F. Kennedy, who entered the White House at the height of the Cold War, made an appeal to talented young people around the country to help him make the world a better place. He followed up with new institutions like the Peace Corps, which sends young citizens around the world to work as secular missionaries for American values like democracy and free enterprise.
Listen to clips of the most famous inaugural addresses in presidential history at the Washington Post website.
7.4 The Public Presidency
Learning Objectives
By the end of this section, you will be able to:
- Explain how technological innovations have empowered presidents
- Identify ways in which presidents appeal to the public for approval
- Explain how the role of first ladies changed over the course of the twentieth century
With the advent of motion picture newsreels and voice recordings in the 1920s, presidents began to broadcast their message to the general public. Franklin Roosevelt, while not the first president to use the radio, adopted this technology to great effect. Over time, as radio gave way to newer and more powerful technologies like television, the Internet, and social media, other presidents have been able to magnify their voices to an even larger degree. Presidents now have far more tools at their disposal to shape public opinion and build support for policies. However, the choice to “go public” does not always lead to political success; it is difficult to convert popularity in public opinion polls into political power. Moreover, the modern era of information and social media empowers opponents at the same time that it provides opportunities for presidents.
THE SHAPING OF THE MODERN PRESIDENCY
From the days of the early republic through the end of the nineteenth century, presidents were limited in the ways they could reach the public to convey their perspective and shape policy. Inaugural addresses and messages to Congress, while circulated in newspapers, proved clumsy devices to attract support, even when a president used plain, blunt language. Some presidents undertook tours of the nation, notably George Washington and Rutherford B. Hayes. Others promoted good relationships with newspaper editors and reporters, sometimes going so far as to sanction a pro-administration newspaper. One president, Ulysses S. Grant, cultivated political cartoonist Thomas Nast to present the president’s perspective in the pages of the magazine Harper’s Weekly.37 Abraham Lincoln experimented with public meetings recorded by newspaper reporters and public letters that would appear in the press, sometimes after being read at public gatherings (Figure 7.13). Most presidents gave speeches, although few proved to have much immediate impact, including Lincoln’s memorable Gettysburg Address.
Figure 7.13 While President Abraham Lincoln was not the first president to be photographed, he was the first to use the relatively new power of photography to enhance his power as president and commander-in-chief. Here, Lincoln poses with Union soldiers (a) during his visit to Antietam, Maryland, on October 3, 1862. President Ulysses S. Grant cultivated a relationship with popular cartoonist Thomas Nast, who often depicted the president in the company of “Lady Liberty” (b) in addition to relentlessly attacking his opponent Horace Greeley.
Rather, most presidents exercised the power of patronage (appointing people who are loyal and politically helpful) and private deal-making to get what they wanted at a time when Congress usually held the upper hand in such transactions. But even that presidential power began to decline with the emergence of civil service reform in the later nineteenth century, which led to most government officials being hired on the basis of merit rather than through patronage. Only when it came to diplomacy and war were presidents able to exercise authority on their own, and even then, institutional as well as political restraints limited their independence of action.
Theodore Roosevelt came to the presidency in 1901, at a time when movie newsreels were becoming popular. Roosevelt, who had always excelled at cultivating good relationships with the print media, eagerly exploited this new opportunity as he took his case to the people with the concept of the presidency as a bully pulpit, a platform from which to push his agenda to the public. His successors followed suit, and they discovered and employed new ways of transmitting their message to the people in an effort to gain public support for policy initiatives. With the popularization of radio in the early twentieth century, it became possible to broadcast the president’s voice into many of the nation’s homes. Most famously, FDR used the radio to broadcast his thirty “fireside chats” to the nation between 1933 and 1944.
In the post–World War II era, television began to replace radio as the medium through which presidents reached the public. This technology enhanced the reach of the handsome young president John F. Kennedy and the trained actor Ronald Reagan. At the turn of the twenty-first century, the new technology was the Internet. The extent to which this mass media technology can enhance the power and reach of the president has yet to be fully realized. In the twenty-first century, presidents face a paradox. While there are more ways than ever to get their message out, be it television channels or social media networks, the complexity of modern media makes it less certain that presidents can reach the public directly. Former president Donald Trump took going public to the extreme, some days sending dozens of tweets to both promote his agenda and attack political opponents. Even his allies and senior officials would be surprised by some of the tweets.
Other presidents have used advances in transportation to take their case to the people. Woodrow Wilson traveled the country to advocate formation of the League of Nations, but his effort failed after a stroke in 1919 forced him to cut the tour short. Both Franklin Roosevelt in the 1930s and 1940s and Harry S. Truman in the 1940s and 1950s used air travel to conduct diplomatic and military business. Under President Dwight D. Eisenhower, a specific plane, commonly called Air Force One, began carrying the president around the country and the world. This gives the president the ability to take their message directly to the far corners of the nation at any time.
GOING PUBLIC: PROMISE AND PITFALLS
The concept of going public involves the president delivering a major television address in the hope that Americans watching the address will be compelled to contact their representatives and senators and that such public pressure will result in the legislators supporting the president on a major piece of legislation. Technological advances have made it more efficient for presidents to take their messages directly to the people than was the case before mass media (Figure 7.14). Presidential visits can build support for policy initiatives or serve political purposes, helping the president reward supporters, campaign for candidates, and seek reelection. It remains an open question, however, whether choosing to go public actually enhances a president’s political position in battles with Congress. Political scientist George C. Edwards goes so far as to argue that taking a president’s position public serves to polarize political debate, increase public opposition to the president, and complicate the chances to get something done. It replaces deliberation and compromise with confrontation and campaigning. Edwards believes the best way for presidents to achieve change is to keep issues private and negotiate resolutions that preclude partisan combat. Going public may be more effective in rallying supporters than in gaining additional support or changing minds.38
Figure 7.14 With the advent of video technology and cable television, the power of the president to reach huge audiences increased exponentially. President Ronald Reagan, shown here giving one of his most famous speeches in Berlin, was an expert at using technology to help mold and project his presidential image to the public. His training as an actor certainly helped in this regard.
Today, it is possible for the White House to take its case directly to the people via websites like White House Live, where the public can watch live press briefings and speeches.
THE FIRST LADY: A SECRET WEAPON?
The president is not the only member of the First Family who often attempts to advance an agenda by going public. First ladies have increasingly exploited the opportunity to gain public support for an issue of deep interest to them. Before 1933, most first ladies served as private political advisers to their husbands. In the 1910s, Edith Bolling Wilson took a more active but still private role, assisting her husband, President Woodrow Wilson, in the last years of his presidency after he was debilitated by a stroke. However, it was Eleanor Roosevelt, the niece of one president and the wife of another, who in the 1930s and 1940s opened the door for first ladies to do something more.
Eleanor Roosevelt took an active role in championing civil rights, becoming in some ways a bridge between her husband and the civil rights movement. She coordinated meetings between FDR and members of the NAACP, championed antilynching legislation, openly defied segregation laws, and pushed the Army Nurse Corps to allow Black women in its ranks. She also wrote a newspaper column and had a weekly radio show. Her immediate successors returned to the less visible role held by her predecessors, although in the early 1960s, Jacqueline Kennedy gained attention for her efforts to refurbish the White House along historical lines, and Lady Bird Johnson in the mid- and late 1960s endorsed an effort to beautify public spaces and highways in the United States. She also established the foundations of what came to be known as the Office of the First Lady, complete with a news reporter, Liz Carpenter, as her press secretary.
Betty Ford took over as first lady in 1974 and became an avid advocate of women’s rights, proclaiming that she was pro-choice when it came to abortion and lobbying for the ratification of the Equal Rights Amendment (ERA). She shared with the public the news of her breast cancer diagnosis and subsequent mastectomy. Her successor, Rosalynn Carter, attended several cabinet meetings and pushed for the ratification of the ERA as well as for legislation addressing mental health issues (Figure 7.15).
Figure 7.15 On November 19, 1977, Rosalynn Carter (center left) and Betty Ford (center right) attended a rally in favor of the passage of the Equal Rights Amendment.
The increasing public political role of the first lady continued in the 1980s with Nancy Reagan’s “Just Say No” antidrug campaign and in the early 1990s with Barbara Bush’s efforts on behalf of literacy. The public role of the first lady reached a new level with Hillary Clinton in the 1990s when her husband put her in charge of his efforts to achieve health care reform, a controversial decision that did not meet with political success. Her successors, Laura Bush in the first decade of the twenty-first century and Michelle Obama in the second, returned to the roles played by predecessors in advocating less controversial policies: Laura Bush advocated literacy and education, while Michelle Obama emphasized physical fitness, healthy eating, and exercise. Nevertheless, the public and political profiles of first ladies remain high, and in the future, the president’s spouse will have the opportunity to use that unelected position to advance policies that might well be less controversial and more appealing than those pushed by the president.
A New Role for the First Lady?
While running for the presidency for the first time in 1992, Bill Clinton frequently touted the experience and capabilities of his wife. There was a lot to brag about. Hillary Rodham Clinton was a graduate of Yale Law School, had worked as a member of the impeachment inquiry staff during the height of the Watergate scandal in Nixon’s administration, and had been a staff attorney for the Children’s Defense Fund before becoming the first lady of Arkansas. Acknowledging these qualifications, candidate Bill Clinton once suggested that by electing him, voters would get “two for the price of one.” The clear implication in this statement was that his wife would take on a far larger role than previous first ladies, and this proved to be the case.39
Shortly after taking office, President Clinton appointed the first lady to chair the Task Force on National Health Care Reform. This organization was to follow through on his campaign promise to fix the problems in the U.S. health care system. Hillary Clinton had privately requested the appointment, but she quickly realized that the complex web of business interests and political aspirations made the topic of health care reform a hornet’s nest. This put the first lady directly into partisan battles that few, if any, previous first ladies had ever faced.
As a testament to both the large role the first lady had taken on and the extent to which she had become the target of political attacks, the recommendations of the task force were soon dubbed “Hillarycare” by opponents. In a particularly contentious hearing in the House, the first lady and Republican representative Dick Armey exchanged pointed jabs. At one point, Armey suggested that the reports of her charm were “overstated” after the first lady likened him to Dr. Jack Kevorkian, a physician known for helping patients commit suicide (Figure 7.16).40 The following summer, the first lady attempted to use a national bus tour to popularize the health care proposal, although distaste for her and for the program had reached such a fever pitch that she sometimes was compelled to wear a bulletproof vest. In the end, the efforts came up short and the reform attempts were abandoned as a political failure. Nevertheless, Hillary Clinton remained a political lightning rod for the rest of the Clinton presidency.
Figure 7.16 Hillary Clinton during her presentation at a congressional hearing on health care reform in 1993. (credit: Library of Congress)
What do the challenges of First Lady Hillary Clinton’s foray into national politics suggest about the dangers of a first lady abandoning the traditionally safe nonpartisan goodwill efforts? What do the actions of the first ladies since Clinton suggest about the lessons learned or not learned?
7.5 Presidential Governance: Direct Presidential Action
Learning Objectives
By the end of this section, you will be able to:
- Identify the power presidents have to effect change without congressional cooperation
- Analyze how different circumstances influence the way presidents use unilateral authority
- Explain how presidents persuade others in the political system to support their initiatives
- Describe how historians and political scientists evaluate the effectiveness of a presidency
A president’s powers can be divided into two categories: direct actions the chief executive can take by employing the formal institutional powers of the office and informal powers of persuasion and negotiation essential to working with the legislative branch. When a president governs alone through direct action, it may break a policy deadlock or establish new grounds for action, but it may also spark opposition that might have been handled differently through negotiation and discussion. Moreover, such decisions are subject to court challenge, legislative reversal, or revocation by a successor. What may seem to be a sign of strength is often more properly understood as independent action undertaken in the wake of a failure to achieve a solution through the legislative process, or an admission that such an effort would prove futile. When it comes to national security, international negotiations, or war, the president has many more opportunities to act directly and in some cases must do so when circumstances require quick and decisive action.
DOMESTIC POLICY
Presidents may not be able to appoint key members of their administration without Senate confirmation, but they can demand the resignation or removal of cabinet officers, high-ranking appointees (such as ambassadors), and members of the presidential staff. During Reconstruction, Congress tried to curtail the president’s removal power with the Tenure of Office Act (1867), which required Senate concurrence to remove presidential nominees who took office upon Senate confirmation. Andrew Johnson’s violation of that legislation provided the grounds for his impeachment in 1868. Subsequent presidents secured modifications of the legislation before the Supreme Court ruled in 1926 that the Senate had no right to impair the president’s removal power.41 In the case of Senate failure to approve presidential nominations, the president is empowered to issue recess appointments (made while the Senate is in recess) that continue in force until the end of the next session of the Senate (unless the Senate confirms the nominee).
The president also exercises the power of pardon without conditions. Once used fairly sparingly—apart from Andrew Johnson’s wholesale pardons of former Confederates during the Reconstruction period—the pardon power has become more visible in recent decades. President Harry S. Truman issued over two thousand pardons and commutations, more than any other post–World War II president.42 President Gerald Ford has the unenviable reputation of being the only president to pardon another president (his predecessor Richard Nixon, who resigned after the Watergate scandal) (Figure 7.17). While not as generous as Truman, President Jimmy Carter also issued a great number of pardons, including several for draft dodging during the Vietnam War. President Reagan was reluctant to use the pardon as much, as was President George H. W. Bush. President Clinton pardoned few people for much of his presidency, but did make several last-minute pardons, which led to some controversy. By the end of his presidency, Barack Obama had granted 212 pardons, or 6 percent of petitions received, numbers similar to those of his predecessor, George W. Bush.43 Early in his presidency, Donald Trump used the pardon in a few visible cases. He set aside sentences for controversial former Sheriff Joe Arpaio of Maricopa County, Arizona, and for former Vice President Dick Cheney’s confidant, Scooter Libby.44 Like other presidents, Trump escalated his use of pardons as the end of his presidency drew near following his loss to Joe Biden in the November 2020 election. In the end, he granted 237 pardons.
Figure 7.17 In 1974, President Ford became the first and still the only president to pardon a previous president (Richard Nixon). Here he speaks before the House Judiciary Subcommittee on Criminal Justice, explaining his reasons for the pardon. While the pardon was unpopular with many and may have cost Ford the election two years later, his constitutional power to issue it is indisputable. (credit: modification of work by the Library of Congress)
Presidents may choose to issue executive orders or proclamations to achieve policy goals. Usually, executive orders direct government agencies to pursue a certain course in the absence of congressional action. A more subtle version pioneered by recent presidents is the executive memorandum, which tends to attract less attention. Many of the most famous executive orders have come in times of war or invoke the president’s authority as commander-in-chief, including Franklin Roosevelt’s order permitting the internment of Japanese Americans in 1942 and Harry Truman’s directive desegregating the armed forces (1948). The most famous presidential proclamation was Abraham Lincoln’s Emancipation Proclamation (1863), which declared enslaved people in areas under Confederate control to be free (with a few exceptions).
Executive orders are subject to court rulings and to changes in policy enacted by Congress. During the Korean War, the Supreme Court struck down Truman’s order seizing the steel industry.45 These orders are also subject to reversal by succeeding presidents, and recent presidents have wasted little time reversing the orders of their predecessors in cases of disagreement. Sustained executive orders, those not overturned in the courts, typically have some prior authority from Congress that legitimizes them. When there is no prior authority, it is much more likely that an executive order will be overturned by a later president. For this reason, while executive orders have seen significant use in recent years, including an uptick under President Trump, the last several presidents have used them sparingly compared to presidents in the early twentieth century (Figure 7.18).
Figure 7.18 Executive actions were unusual until the late nineteenth century. They became common in the first half of the twentieth century but have been growing less popular for the last few decades because they are often overturned in court if Congress has not given the president prior delegated authority.
Executive Order 9066
Following the devastating Japanese attacks on the U.S. Pacific fleet at Pearl Harbor in 1941, many in the United States feared that Japanese Americans on the West Coast had the potential and inclination to form a fifth column (a hostile group working from the inside) for the purpose of aiding a Japanese invasion. These fears mingled with existing anti-Japanese sentiment across the country and created a paranoia that washed over the West Coast like a large wave. In an attempt to calm fears and prevent any real fifth-column actions, President Franklin D. Roosevelt signed Executive Order 9066, which authorized the removal of people from military areas as necessary. When the military dubbed the entire West Coast a military area, it effectively allowed for the removal of more than 110,000 Japanese Americans from their homes. These people, many of them U.S. citizens, were moved to relocation centers in the interior of the country. They lived in the camps there for two and a half years (Figure 7.19).46
Figure 7.19 This sign appeared outside a store in Oakland, California, owned by a Japanese American after the bombing of Pearl Harbor in 1941. After the president’s executive order, the store was closed and the owner evacuated to an internment camp for the duration of the war. (credit: the Library of Congress)
The overwhelming majority of Japanese Americans felt shamed by the actions of the Japanese empire and willingly went along with the policy in an attempt to demonstrate their loyalty to the United States. But at least one Japanese American refused to go along. His name was Fred Korematsu, and he decided to go into hiding in California rather than be taken to the internment camps with his family. He was soon discovered, turned over to the military, and sent to the internment camp in Utah that held his family. But his challenge to the internment system and the president’s executive order continued.
In 1944, Korematsu’s case was heard by the Supreme Court. In a 6–3 decision, the Court ruled against him, arguing that the administration had the constitutional power to sign the order because of the need to protect U.S. interests against the threat of espionage.47 Forty-four years after this decision, President Reagan issued an official apology for the internment and provided some compensation to the survivors. In 2011, the Justice Department went a step further by filing a notice officially recognizing that the solicitor general of the United States acted in error by arguing to uphold the executive order. (The solicitor general is the official who argues cases for the U.S. government before the Supreme Court.) However, despite these actions, in 2014, the late Supreme Court justice Antonin Scalia was quoted as saying that while he believed the decision was wrong, it could occur again.48
What do the Korematsu case and the internment of over 100,000 Japanese Americans suggest about the extent of the president’s war powers? What does this episode in U.S. history suggest about the weaknesses of constitutional checks on executive power during times of war?
To learn more about the relocation and confinement of Japanese Americans during World War II, visit Heart Mountain online.
Finally, presidents have also used the line-item veto and signing statements to alter or influence the application of the laws they sign. A line-item veto is a type of veto that keeps the majority of a spending bill unaltered but nullifies certain lines of spending within it. While a number of states allow their governors the line-item veto (discussed in the chapter on state and local government), the president acquired this power only in 1996 after Congress passed a law permitting it. President Clinton used the tool sparingly. However, those entities that stood to receive the federal funding he lined out brought suit. Two such groups were the City of New York and the Snake River Potato Growers in Idaho.49 The Supreme Court heard their claims together and just sixteen months later declared unconstitutional the act that permitted the line-item veto.50 Since then, presidents have asked Congress to draft a line-item veto law that would be constitutional, although none have made it to the president’s desk.
On the other hand, signing statements are statements issued by a president when signing legislation that indicate how the chief executive will interpret and enforce the legislation in question. Signing statements are less powerful than vetoes, though congressional opponents have complained that they derail legislative intent. Signing statements have been used by presidents since at least James Monroe, but they have become far more common in recent decades.
NATIONAL SECURITY, FOREIGN POLICY, AND WAR
Presidents are more likely to justify the use of executive orders in cases of national security or as part of their war powers. In addition to mandating emancipation and the internment of Japanese Americans, presidents have issued orders to protect the homeland from internal threats. Most notably, Lincoln ordered the suspension of the privilege of the writ of habeas corpus in 1861 and 1862 before seeking congressional legislation to undertake such an act. Presidents hire and fire military commanders; they also use their power as commander-in-chief to aggressively deploy U.S. military force. Congress rarely has taken the lead over the course of history, with the War of 1812 being the lone exception. Pearl Harbor was a salient case where Congress did make a clear and formal declaration when asked by FDR. However, since World War II, it has been the president and not Congress who has taken the lead in engaging the United States in military action outside the nation’s boundaries, most notably in Korea, Vietnam, and the Persian Gulf (Figure 7.20).
Figure 7.20 By landing on an aircraft carrier and wearing a flight suit to announce the end of major combat operations in Iraq in 2003, President George W. Bush was carefully emphasizing his presidential power as commander-in-chief. (credit: Tyler J. Clements)
Presidents also issue executive agreements with foreign powers. Executive agreements are formal agreements negotiated between two countries but not ratified by a legislature as a treaty must be. As such, they are not treaties under U.S. law, since treaties require the approval of two-thirds of the Senate. Treaties, presidents have found, are particularly difficult to get ratified. And with the fast pace and complex demands of modern foreign policy, concluding treaties with countries can be a tiresome and burdensome chore. That said, some executive agreements do require some legislative approval, such as those that commit the United States to make payments and thus are restrained by the congressional power of the purse. But for the most part, executive agreements signed by the president require no congressional action and are considered enforceable as long as the provisions of the executive agreement do not conflict with current domestic law.
The American Presidency Project has gathered data outlining presidential activity, including measures for executive orders and signing statements.
THE POWER OF PERSUASION
The framers of the Constitution, concerned about the excesses of British monarchical power, made sure to design the presidency within a network of checks and balances controlled by the other branches of the federal government. Such checks and balances encourage consultation, cooperation, and compromise in policymaking. This is most evident at home, where the Constitution makes it difficult for either Congress or the chief executive to prevail unilaterally, at least when it comes to constructing policy. Although much is made of political stalemate and obstructionism in national political deliberations today, the framers did not want to make it too easy to get things done without a great deal of support for such initiatives.
It is left to the president to employ a strategy of negotiation, persuasion, and compromise in order to secure policy achievements in cooperation with Congress. In 1960, political scientist Richard Neustadt put forward the thesis that presidential power is the power to persuade, a process that takes many forms and is expressed in various ways.51 Yet the successful employment of this technique can lead to significant and durable successes. For example, legislative achievements tend to be of greater duration because they are more difficult to overturn or replace, as the case of health care reform under President Barack Obama suggests. Obamacare has faced court cases and repeated (if largely symbolic) attempts to gut it in Congress. Overturning it will take a new president who opposes it, together with a Congress that can pass the dissolving legislation.
In some cases, cooperation is essential, as when the president nominates and the Senate confirms persons to fill vacancies on the Supreme Court, an increasingly contentious area of friction between branches. While Congress cannot populate the Court on its own, it can frustrate the president’s efforts to do so. Presidents who seek to prevail through persuasion, according to Neustadt, target Congress, members of their own party, the public, the bureaucracy, and, when appropriate, the international community and foreign leaders. Of these audiences, perhaps the most obvious and challenging is Congress.
Read “Power Lessons for Obama” at this website to learn more about applying Richard Neustadt’s framework to the leaders of today.
Much depends on the balance of power within Congress: Should the opposition party hold control of both houses, it will be difficult indeed for the president to realize their own objectives, especially if the opposition is intent on frustrating all initiatives. However, even control of both houses by the president’s own party is no guarantee of success or even of productive policymaking. For example, neither Barack Obama nor Donald Trump achieved all they desired despite having favorable conditions for the first two years of their presidencies. In times of divided government (when one party controls the presidency and the other controls one or both chambers of Congress), it is up to the president to cut deals and make compromises that will attract support from at least some members of the opposition party without excessively alienating members of their own party. Both Ronald Reagan and Bill Clinton proved effective in dealing with divided government—indeed, Clinton scored more successes with Republicans in control of Congress than he did with Democrats in charge.
It is more difficult to persuade members of the president’s own party or the public to support a president’s policy without risking the dangers inherent in going public. There is precious little opportunity for private persuasion while also going public in such instances, at least directly. The way the president and their staff handle media coverage of the administration may afford some opportunities for indirect persuasion of these groups. It is not easy to persuade the federal bureaucracy to do the president’s bidding unless the chief executive has made careful appointments. When it comes to diplomacy, the president must relay some messages privately while offering incentives, both positive and negative, in order to elicit desired responses, although at times, people heed only the threat of force and coercion.
While presidents may choose to go public in an attempt to put pressure on other groups to cooperate, most of the time they “stay private” as they attempt to make deals and reach agreements out of the public eye. The tools of negotiation have changed over time. Once chief executives played patronage politics, rewarding friends while attacking and punishing critics as they built coalitions of support. But the advent of civil service reform in the 1880s steadily reduced the scope and effectiveness of patronage and deprived presidents of that option. Although the president may call upon various agencies for assistance in lobbying for proposals, such as the Office of Legislative Liaison with Congress, it is often left to the chief executive to offer incentives and rewards. Some of these are symbolic, like private meetings in the White House or an appearance on the campaign trail. The president must also find common ground and make compromises acceptable to all parties, thus enabling everyone to claim they secured something they wanted.
Complicating Neustadt’s model, however, is that many of the ways he claimed presidents could shape favorable outcomes require going public, which as we have seen can produce mixed results. Political scientist Fred Greenstein, on the other hand, touted the advantages of a “hidden hand presidency,” in which the chief executive did most of the work behind the scenes, wielding both the carrot and the stick.52 Greenstein singled out President Dwight Eisenhower as particularly skillful in such endeavors.
OPPORTUNITY AND LEGACY
A president’s performance, reputation, and ultimately legacy often depend on circumstances that are largely out of their control. Did the president prevail in a landslide or was it a closely contested election? Did they come to office as the result of death, assassination, or resignation? How much support does the president’s party enjoy, and is that support reflected in the composition of both houses of Congress, just one, or neither? Will the president face a Congress ready to embrace proposals or poised to oppose them? Whatever a president’s ambitions, it will be hard to realize them in the face of a hostile or divided Congress, and the options to exercise independent leadership are greater in times of crisis and war than when looking at domestic concerns alone.
Then there is what political scientist Stephen Skowronek calls “political time.”53 Some presidents take office at times of great stability with few concerns. Unless there are radical or unexpected changes, a president’s options are limited, especially if voters hoped for a simple continuation of what had come before. Other presidents take office at a time of crisis or when the electorate is looking for significant changes. Then there is both pressure and opportunity for responding to those challenges. Some presidents, notably Theodore Roosevelt, openly bemoaned the lack of any such crisis, which Roosevelt deemed essential for him to achieve greatness as a president.
People in the United States claim they want a strong president. What does that mean? At times, scholars point to presidential independence, even defiance, as evidence of strong leadership. Thus, vigorous use of the veto power in key situations can cause observers to judge a president as strong and independent, although far from effective in shaping constructive policies. Nor is such defiance and confrontation always evidence of presidential leadership skill or greatness, as the case of Andrew Johnson should remind us. When is effectiveness a sign of strength, and when are we confusing being headstrong with being strong? Sometimes, historians and political scientists see cooperation with Congress as evidence of weakness, as in the case of Ulysses S. Grant, who was far more effective in garnering support for administration initiatives than scholars have given him credit for.
These questions overlap with those concerning political time and circumstance. While domestic policymaking requires far more give-and-take and a fair share of cajoling and collaboration, national emergencies and war offer presidents far more opportunity to act vigorously and at times independently. This phenomenon often produces the rally around the flag effect, in which presidential popularity spikes during international crises. A president must always be aware that politics, according to Otto von Bismarck, is the art of the possible, even as it is a president's duty to increase what might be possible by persuading both members of Congress and the general public of what needs to be done.
Finally, presidents often leave a legacy that lasts far beyond their time in office (Figure 7.21). Sometimes, this is due to the long-term implications of policy decisions. Critical to the notion of legacy is the shaping of the Supreme Court as well as other federal judges. Long after John Adams left the White House in 1801, his appointment of John Marshall as chief justice shaped American jurisprudence for over three decades. No wonder confirmation hearings have grown more contentious in the cases of highly visible nominees. Other legacies are more difficult to define, although they suggest that, at times, presidents cast a long shadow over their successors. It was a tough act to follow George Washington, and in death, Abraham Lincoln’s presidential stature grew to extreme heights. Theodore and Franklin D. Roosevelt offered models of vigorous executive leadership, while the image and style of John F. Kennedy and Ronald Reagan influenced and at times haunted or frustrated successors. Nor is this impact limited to chief executives deemed successful: Lyndon Johnson’s Vietnam and Richard Nixon’s Watergate offered cautionary tales of presidential power gone wrong, leaving behind legacies that include terms like Vietnam syndrome and the tendency to add the suffix “-gate” to scandals and controversies.
Figure 7.21 The youth and glamour that John F. Kennedy and first lady Jacqueline brought to the White House in the early 1960s (a) helped give rise to the legend of “one brief shining moment that was Camelot” after Kennedy’s presidency was cut short by his assassination on November 22, 1963. Despite a tainted legacy, President Richard Nixon gives his trademark “V for Victory” sign as he leaves the White House on August 9, 1974 (b), after resigning in the wake of the Watergate scandal.