The Supreme Court’s decision to end affirmative action in college admissions has not so much settled a debate as it has begun a whole host of new ones. While Students for Fair Admissions v. Harvard holds that colleges and universities are not permitted to recreate their affirmative action regimes in the aggregate through other admissions methods, such as the personal essay, schools may still consider “an applicant’s discussion of how race affected his or her life, be it through discrimination, inspiration or otherwise,” in the words of Chief Justice John Roberts’s majority opinion. Similarly, nonacademic factors such as geographic location and socioeconomic status remain accepted grounds on which to consider applicants.
This obviously leaves ample room for subjectivity on the part of colleges in orienting their admissions processes, including in directions that still prioritize racial status, and it leaves the future of the college admissions landscape rather unsettled.
And no sooner had the Supreme Court struck down Harvard’s affirmative action policies than public debate turned to yet narrower questions of matriculation, most notably the use of legacy admissions by Ivy League and other prestigious institutions. “If SCOTUS was serious about their ludicrous ‘colorblindness’ claims, they would have abolished legacy admissions, aka affirmative action for the privileged,” Rep. Alexandria Ocasio-Cortez (D-NY) tweeted following the decision. A lawsuit has already been filed against Harvard’s use of legacy admissions, or the preferential consideration given to children of alumni. Similar calls for the abolition of legacy policies have continued over the past few weeks, including from President Joe Biden and other politicians, university presidents and professors, and high-profile voices across the political spectrum.

At the heart of this debate is the question of what should not determine admissions: We do not want race, by itself, determining who is able to attend highly selective colleges and universities; we do not want the inherited privilege of alumni status, typically concentrated among those who are wealthy and often white, carrying the day for certain applicants; and so on.
While diversity of experience, socioeconomic class, geographical residency, and background are all desirable factors to consider when seeking to compose a well-rounded freshman class, one would hope the primary criterion for evaluating applicants would be academic merit. College is, after all, an academic endeavor. Students graduating high school with a record of scholastic achievement are, on average, better suited to the demands and expectations of undergraduate coursework and, in an academic sense, somewhat more deserving of the still-limited opportunity to study at an institution putatively dedicated to higher learning than those with no such record of achievement.
Yet academic merit has been remarkably absent from the recent debates over college admissions. Indeed, it is as if the entire pretext that college-going is itself about educational attainment is being abandoned in real time. More than 80% of four-year colleges and universities did not require SAT or ACT standardized test scores for fall 2023 admissions, according to a nationwide list compiled by the National Center for Fair & Open Testing, or FairTest. What began largely as a COVID-era convenience has quickly been enshrined as a new measure of equity and fairness in admissions at schools across the land, including highly selective, academically prestigious institutions such as Harvard, Stanford University, and Columbia University. In the wake of the SFFA v. Harvard decision, a number of schools, such as those in the University of California system, announced that they will find new ways to prioritize “adversity” and other such experiential metrics as a way to preserve an emphasis on race and ethnicity within their admissions standards. At the University of California, Davis, School of Medicine, this has resulted in the ranking of applicants based on an “adversity score,” determined by the self-reported disadvantages prospective students have faced.

Meanwhile, public trust in higher education continues to sink with each passing year. Even as arguments were swiftly shifting from affirmative action to legacy admissions, Gallup released its latest survey findings, reporting that the public’s overall confidence in higher education has fallen “to 36%, sharply lower than in two prior readings in 2015 (57%) and 2018 (48%).” While trust declined most sharply among Republicans, dropping a massive 37 points from 2015 to 19%, confidence in higher education fell among independents and Democrats alike, resting at 32% and 59%, respectively.
The public, it seems, is waking up to the fact that higher education has become untethered from its original purposes. At the same time, it remains as central to socioeconomic advancement as it has ever been: a near-mandatory gateway to middle-class employment, and an increasingly expensive one. Employers overwhelmingly prioritize college degrees in hiring. Thus, it matters enormously who gets into and graduates from college. Yet fairness and representation have become the paramount concerns, above merit, academic ability, or even the likelihood of graduation and coursework completion.
This is why debates over things such as legacy admissions are so sexy and attention-grabbing. There is a deep sense of unfairness when the progeny of the rich have spots reserved for them at the institutions most associated with wealth creation and career remuneration, particularly when merit is removed from the equation. Yet when set against the broader reality of our K-12-to-college-to-workforce pathway, such debates are little more than red herrings.
It feels much safer, much easier, to argue about the privileged minority benefiting from legacy admissions, which, according to professor Evan Mandery’s Poison Ivy, accounts for roughly 14% of Harvard’s entering class, than to grapple with the fundamental failures of the higher education system as a whole. Consider what we know about both the inputs and the outputs of our current system.
College applicants, as we have seen, increasingly no longer have to prove academic achievement through standardized test scores. Such scores are an admittedly imperfect measure of intellectual aptitude, but the alternatives hinge almost entirely on subjectivity. While GPA remains part of the admissions game, it is being watered down in favor of personal experience and demographic attachments. At the same time, the K-12 farm system, if you will, produces uneven results of its own. High school graduation rates around the country remain high even as individual metrics of reading, mathematics, and writing skill paint a bleak picture of student proficiency, with the latest National Assessment of Educational Progress scores, the “Nation’s Report Card,” showing further declines under COVID-era remote schooling policies.
The outputs of the higher education system, the graduates themselves, are also suspect. The National Student Clearinghouse Research Center found that roughly 62% of full-time students at four-year institutions graduate within six years. Six years, mind you: two years, or four semesters, longer than what should be the standard time to completion. Meanwhile, grade inflation is a well-recognized phenomenon across the undergraduate landscape. As Derek Thompson and David Perell recently noted, the average GPA at Harvard between 1890 and 1950 was 2.5. It was 3.0 in 1960, and it is a whopping 3.8 today. A middling six-year graduation rate even as grades improve does not inspire much confidence in the level of learning being acquired by undergraduates. What they are acquiring is loads of loan debt, of course, along with routinely liberal-coded social and ideological conditioning.
It strikes me that we no longer understand what college should be for, beyond a six-year holding pattern before middle- and upper-class employment. Even as we focus on designs to make admissions fairer and broader, so that the opportunities to matriculate into this system are more evenly distributed, we have skipped the deeper question of why the system should exist in its current state at all. From whichever angle you wish to consider it, what we have now is failing at whatever goal you deem most important. If you view colleges and universities as places of deep learning, dedicated to academic rigor and aptitude, that goal is ill-served by our get-them-all-in-and-graduate-as-many-as-possible mindset. If your priority for the college system is increasing equity for socioeconomic mobility, itself a noble goal, surely a more streamlined credentialing process that does not require the traditionally underprivileged and poor to forgo years of earning would better suit such needs.

If we have lost the plot on our higher education system, the focus of future debates should not be on red herrings but on imagining a different reality. This involves digging into the barriers preventing systemic change, but it also involves asking deeper questions about what higher education should be for. The benefit of systemic innovation is that higher education operating under new modes and orders does not have to be all things to all people under one model, as it is now. Departing from a four-year, institutionally controlled system that attempts to achieve equity, employment preparation, intellectual exploration, and scholarly patronage all at once means those goals can be uncoupled from one another, letting various sectors of a new system pursue each endeavor more thoroughly rather than having them clash against one another.
While there are legal obstacles to revolutionizing our system, principal among them the disparate impact discrimination doctrine around employer hiring that has itself elevated the college degree to its privileged place in our labor market, as well as the self-interested agendas of colleges, bureaucracies, and current stakeholders, I submit that the biggest obstacle is our fear of letting pluralism reign. Innovation might feel unfair, after all, and we just can’t have that. We have arrived at our present situation in large part because of the fear that someone, somewhere, might be excluded. Uniformity and bigness rule the day, the system growing and metastasizing into a bloated, self-perpetuating mass designed to be all things to all people. Not everyone should go to college; not everyone is equipped to, nor does everyone want to. To say as much is not a value judgment or a negative reflection on those people, however much advocates and the zeitgeist writ large might disagree.
A move toward a more pluralistic system populated with various postsecondary pathways, from streamlined credential programs and public-private workforce partnerships to smaller, more exclusive, classically differentiated universities revivified under the strict goals of scholarship rather than employment training, does by necessity discriminate in purpose and design. But that does not make it discriminatory in the way we typically understand the term. Distinction is good and valuable, regardless of how afraid of it we have become.
J. Grant Addison is deputy editor of the Washington Examiner magazine.