Not the Cream of the Crop

Republicans in Alabama are facing a nightmare scenario in their upcoming special election—either they send to the Senate Doug Jones, a Democrat who does not share their values on important issues like abortion, or Roy Moore, a Republican who has been credibly accused of sexual improprieties with teenage girls.

How did it come to this? There are a lot of ways to answer that question, but the proximate cause is that in the September primary, roughly 262,000 Republican primary voters supported Moore over incumbent Luther Strange (who got 218,000). By way of comparison, Donald Trump won 1.3 million votes in last year’s general election in Alabama. So Moore won the Republican nomination by winning about 20 percent of the Republican electorate in the state.

Alabama is not the only place where a very narrow slice of the electorate has made dubious and consequential choices in primaries. In recent election cycles, Republican voters have thrown away winnable races by nominating such notable clunkers as Todd Akin in Missouri, Richard Mourdock in Indiana, and Christine O’Donnell in Delaware. And that’s not counting their judgment in presidential primaries.

Is it perhaps time to reconsider the merits of nominating primaries as a way to select quality candidates for office?

Primaries were popularized in the 20th century. Along with the direct election of senators, they are part and parcel of the progressive movement’s belief that more democracy is always better.

This is a faith that most of the Founding Fathers did not share. To be precise, the Founders came from the republican political tradition and indeed had quite radical views regarding it. They thought that government belongs to the people at large and, unlike many European republicans, rejected the legitimacy of any hereditary estates ruling in conjunction with the people. The Constitutional Convention was in large measure an effort to save this radical republicanism from the excesses of democracy.

The state governments of the 1780s were highly democratic as well as terribly behaved. They treated political minorities disdainfully, wrote and revised laws too frequently, and did not contribute to the welfare of the nation. Elbridge Gerry put it bluntly when, at the convention, he argued, “The evils we experience flow from the excess of democracy.” In James Madison’s view, the Constitution was a way to deal with “the inconveniencies of democracy” while remaining “consistent with the democratic form of government.”

The original constitutional schema placed the people directly in charge of only one institution—the House of Representatives. The Senate was selected by state legislatures, and the president by the Electoral College. This is a very republican system, but not a democratic one.

An integral ingredient in the Constitution is representative government. The people do not rule directly, but rather through (one hopes) a wise and enlightened class of mediators whom they select. It soon became clear, however, that the people needed more guidance in choosing representatives than the Constitution offered. This is how the first political parties sprang into being—to frame the debate for the public, to educate it, and to recommend estimable candidates who could achieve the party’s goals in government.

Still, the role of party nominations in our system has rarely been given much thought. This is unfortunate, because it gets to the same problem with which the Founders struggled: What is the proper balance between the masses and the elites in public affairs? The Founders were careful in designing the Constitution not to move too far in one direction or the other, striking a middle ground between popular sovereignty and the mediating voices of what they hoped would be a caste of political leaders. Subsequent generations have treated the process of nominating potential members of that caste in a slapdash manner.

Within the first political parties, the process of nomination was largely the domain of the elites alone. Presidential nominations, for instance, were made by the parties’ congressional caucuses. But the surge of democratic sentiment in the 1830s made this seem too high-toned, and nominating conventions—which brought in a wider array of public voices—were later employed.

As the 19th century unfolded, the convention process became widely corrupted. Party organizations were funded by government patronage (jobs, contracts, emoluments, and so on), which had precious little to do with public service. After the patronage system was done away with on the federal level, industrial and financial magnates stepped in, subsidizing party machines in the states to dominate politics for their own benefit.

It was this problem that the progressives were rebelling against. And from this perspective, the primary election seemed an eminently sensible device: If the elites were corrupted, power should be taken from them and given to the people. Over the course of the 20th century, that is precisely what happened, as primaries became the main way nominations were decided.

Yet it is not at all clear that the people at large even care enough to make such decisions. Turnout in most primaries is so embarrassingly low that it is unreasonable to conclude the primary vote is representative of any but the views of the most interested factions within a party—and sometimes those can be quite extreme.

Another problem is that primaries happen without the aid of partisan identification that voters enjoy in general elections. This is an enormously useful heuristic device for voters who do not take the time to learn the ins and outs of policies. They rely on that label—Democratic or Republican—to get a rough sense of what the candidate will do in office. But in an intraparty primary, there are no such labels. Worse, voters in a primary are asked to deliberate on finer-grained matters than a candidate’s general disposition on the issues. It’s about character, integrity, intelligence, independence—the sorts of qualities that can be the hardest to tease out, especially nowadays when the main interaction with candidates is via TV commercials.

Primaries have also proven very ineffective at removing bad actors from government, one of the main reasons they were introduced in the first place. Lacking sufficient information, voters have a natural disposition to stick with the status quo. This conservatism is reinforced by the fact that incumbents are usually well financed and can scare away serious challengers. Far from being a tool to clean politics of corruption, primaries too often now have the opposite effect.

Does this mean a return to nominating conventions is warranted? No. Conventions suffer from many of the same problems. The influence of the smoke-filled room of yore is overstated, but there is no doubt that convention delegates can cut corrupt deals. Moreover, conventions often attract the ideological diehards, creating circumstances in which the nominees do not reflect the party at large.

Instead, what is really needed is some serious thought. For too long, public intellectuals in the United States have taken for granted the process of party nominations, accepting without question the normative value of the primary. The truth is that just as parties are essential to democratic governance, a sensible system of party nominations is necessary to make sure that government is actually staffed with people with the right temperament, education, and attachments to the community. We do not have that in this country right now, and that is a big problem.

Jay Cost is a contributing editor at THE WEEKLY STANDARD.