Affirmative action in the United States
Based on Wikipedia: Affirmative action in the United States
In 1865, General William Tecumseh Sherman stood on the soil of Georgia and proposed a radical act of restitution: divide the land and goods confiscated from Confederates and grant them to freed Black slaves. The vision was simple yet seismic—forty acres and a mule. It was an acknowledgment that freedom without economic foundation was a hollow promise, a legal status without the material means to sustain it. That proposal was met with immediate and fierce resistance. President Andrew Johnson, a man who viewed the Reconstruction era through the lens of a bitter reunion rather than a moral reckoning, swiftly revoked Sherman's military orders. When Johnson vetoed the Civil Rights Act of 1866, he did so with a rationale that would echo through the next century and a half of American policy debates. He argued that the bill established safeguards for the "colored race" that went "infinitely beyond any that the general government has ever provided for the white race." To Johnson, and to the forces of reaction he represented, any policy designed to assist the formerly enslaved was not justice, but "a wrongful venture in racial favoritism aimed at assisting the undeserving Negro."
This tension, born in the ashes of the Civil War, has defined the American experiment ever since. It is the friction between the ideal of a colorblind meritocracy and the reality of a history stained by systemic exclusion. Today, as the dust settles on a new legal era following the Supreme Court's 2023 decision in Students for Fair Admissions v. Harvard, the nation finds itself once again at a crossroads, forced to confront whether the mechanisms designed to repair the fractures of the past are still necessary, or if they have become the very source of new divisions.
The Architecture of Exclusion and the Birth of a Remedy
To understand affirmative action, one must first understand the architecture of exclusion it was built to dismantle. It did not emerge in a vacuum during the 1960s. The term itself first appeared in the National Labor Relations Act of 1935, the Wagner Act, championed by Senator Robert F. Wagner. At the time, the context was not racial integration but labor rights. The act was designed to protect workers from being blacklisted or fired for unionizing, empowering the National Labor Relations Board to ensure fair employment practices. It was a tool for economic security, a way to level the playing field for low-income groups in the throes of the Great Depression.
However, the concept of using federal power to actively correct imbalances for specific groups took a different, more contentious turn in the 1930s and 40s. The Public Works Administration (PWA) implemented an innovative, though unpublicized, policy requiring private construction firms hired for public housing projects to hire Black workers. About 13% of these new hires were Black, a statistic that marked a quiet but significant shift in the federal approach to labor. Yet, this policy was short-lived, ending by 1941 as the nation's focus shifted to the war effort and the political will to enforce such integration waned.
The true catalyst for the modern affirmative action era was the Civil Rights Movement of the 1950s and 60s. As the movement gained momentum, it became clear that simply removing legal barriers—the "negative" rights of non-discrimination—was insufficient to overcome the deep-seated disadvantages of centuries of slavery and Jim Crow. The Civil Rights Act of 1964, specifically Title VII, prohibited racial discrimination by employers with 25 or more employees (a threshold later lowered to 15). But the law was reactive; it punished discrimination after it happened. It did not actively push institutions to change.
The proactive mandate had in fact arrived earlier. In 1961, President John F. Kennedy issued Executive Order 10925, which required government contractors to take "affirmative action to ensure that applicants are employed, and that employees are treated during employment, without regard to their race, creed, color, or national origin." With that order, the phrase "affirmative action" shifted from a labor protection to a mandate for proactive inclusion. It was a recognition that neutrality in an unequal system often perpetuates inequality.
The policy evolved rapidly under the Nixon administration. The Revised Philadelphia Plan, implemented in 1969, was the first federal policy of race-conscious affirmative action. It required government contractors in the construction industry to set "goals and timetables" for integrating their workforce. This was a direct response to the entrenched exclusion of Black workers from skilled trades in Philadelphia and beyond. The plan was controversial, facing immediate legal and political challenges, but it established a precedent: the government could not just wait for discrimination to stop; it had to demand progress.
The Legal Labyrinth: Quotas, Goals, and the Supreme Court
As affirmative action moved from executive orders to the courtrooms, the American legal system was forced to grapple with the constitutionality of race-conscious policies. The central question was a paradox: could the Equal Protection Clause of the Fourteenth Amendment, designed to protect minorities, be violated by using race as a factor to help them?
The first major flashpoint came in 1978 with Regents of the University of California v. Bakke. Allan Bakke, a white applicant, had been rejected twice by the University of California, Davis Medical School. He sued, claiming that the school's use of racial quotas, which reserved 16 of 100 seats for minority applicants, violated the Civil Rights Act of 1964 and the Constitution. The Supreme Court's decision was a fractured compromise, with Justice Lewis Powell's controlling opinion splitting the difference between two four-justice blocs. It ruled that the specific quota system used by UC Davis was unconstitutional, effectively banning rigid numerical quotas, while also holding that race could be used as one of many factors in admissions to serve the compelling interest of student body diversity.
This decision created a complex legal landscape. It banned the blunt instrument of quotas but opened the door for the more subtle mechanism of holistic review. Institutions were told they could not count heads by race, but they could consider race as a "plus" factor in the context of an individual's application. This distinction became the bedrock of affirmative action for the next forty-five years.
The tension flared again in the 1990s. In Hopwood v. Texas (1996), the Fifth Circuit Court of Appeals ruled that the University of Texas Law School's affirmative action program was unconstitutional, effectively barring race-conscious admissions in Louisiana, Mississippi, and Texas until the Supreme Court intervened. This decision sent shockwaves through higher education, creating a legal patchwork where the rules of admission depended on geography.
The Supreme Court finally addressed this chaos in Grutter v. Bollinger in 2003. The case involved Barbara Grutter, a white applicant rejected by the University of Michigan Law School. In a 5-4 decision, the Court upheld the university's admissions policy, affirming that achieving a diverse student body was a compelling state interest. Justice Sandra Day O'Connor, writing for the majority, famously declared that the use of race in admissions should be limited in time, predicting that "25 years from now, the use of racial preferences will no longer be necessary." She envisioned a future where the legacy of discrimination had been sufficiently healed to allow for a truly colorblind society.
The State-Level Rebellion and the Rise of Bans
While the federal courts were wrestling with the nuances of diversity, a different battle was being waged in the states. The 1990s and 2000s saw a surge in voter-led initiatives to ban affirmative action, driven by a growing belief that these policies were a form of reverse discrimination. California led the charge in 1996 with Proposition 209, which banned the state from considering race, sex, or ethnicity in public employment, education, and contracting. The measure passed with 54% of the vote, signaling a shift in the national mood.
The movement spread like wildfire. Washington followed in 1998 with Initiative 200, though it was later rescinded in 2022. Florida banned the practice in 1999 via an executive order by Governor Jeb Bush. Michigan joined the list in 2006, Nebraska in 2008, Arizona in 2010, New Hampshire in 2012, Oklahoma in 2012, and Idaho in 2020. In total, nine states now explicitly ban race-based affirmative action in employment and, in most cases, education. The methods varied: some were passed by legislatures, others by executive order, but most were approved by voters at the ballot box, reflecting a deep-seated public skepticism about government-mandated racial preferences.
These bans had immediate and measurable effects. In California and Washington, the number of minority students admitted to flagship universities like UC Berkeley and the University of Washington dropped sharply, prompting a decades-long effort to diversify campuses through race-neutral alternatives such as socioeconomic-based admissions and percentage plans. The bans created a stark reality: in states where affirmative action was prohibited, the share of Black and Latino students at elite public institutions often fell well below those groups' share of the college-age population.
The Final Blow: Students for Fair Admissions v. Harvard
For two decades, the legal framework established by Grutter held firm despite mounting pressure; the Supreme Court narrowly reaffirmed race-conscious admissions as recently as Fisher v. University of Texas in 2016. But the political and legal tides were shifting. By the 2020s, the language of "affirmative action" was increasingly replaced by the broader, less legally contentious vocabulary of "diversity, equity, and inclusion." Yet the core legal question remained: can a university consider an applicant's race to build a diverse class?
In 2023, the Supreme Court answered with a resounding no. In Students for Fair Admissions v. Harvard and its companion case Students for Fair Admissions v. UNC, the Court issued a landmark ruling that effectively ended race-based affirmative action in college admissions nationwide. The majority opinion, written by Chief Justice John Roberts, held that the admissions programs at Harvard and the University of North Carolina violated the Equal Protection Clause of the Fourteenth Amendment. The Court reasoned that the use of race as a factor in admissions was not sufficiently narrowly tailored and that the goals of diversity could be achieved through race-neutral means.
The decision was a seismic shift in American higher education, dismantling the 45-year framework built on Bakke and Grutter. Because Harvard is a private institution, the Court reached its program through Title VI of the Civil Rights Act of 1964, which prohibits discrimination in any program receiving federal financial assistance and which the Court has long treated as coextensive with the Equal Protection Clause. For many, the ruling was the culmination of a long legal campaign by opponents who argued that affirmative action had always been a form of discrimination against white and Asian American applicants. The concurrences underscored this view, with Justice Clarence Thomas arguing that the Constitution is colorblind and that any consideration of race is inherently discriminatory.
The ruling did not explicitly ban all considerations of race. Universities can still consider how an applicant's race has shaped their life experiences, provided it is tied to a specific quality of character or ability the university seeks to cultivate. However, the practical effect is a massive contraction of the ability of institutions to use race as a factor in admissions. The era of explicit race-conscious admissions is over, replaced by a new, uncertain landscape where universities must scramble to find race-neutral ways to maintain diversity.
The Human Cost: A Debate of Principles and Realities
The debate over affirmative action has always been more than a legal technicality; it is a battle over the soul of American society. Supporters argue that these policies are a necessary corrective to centuries of systemic oppression. They point to the enduring reality of bias. Studies consistently show that job seekers with African American-sounding names receive fewer callbacks than those with white-sounding names, even with identical resumes. They argue that affirmative action is not obsolete because the playing field is still not level. Supporters contend that these policies promote equality and representation for groups that are socioeconomically disadvantaged or have faced historical discrimination, countering the continuing prejudice that plagues women and minorities.
For the millions of women and minorities who have benefited from these policies, affirmative action has been a lifeline. It opened doors to elite universities, high-paying jobs, and positions of leadership that would have otherwise remained closed. It created a professional class of Black and Latino doctors, lawyers, and executives who serve as role models and mentors for future generations. The argument is that without these interventions, the cycle of disadvantage would have continued unchecked.
Conversely, opponents argue that affirmative action constitutes a new form of racism. They claim that these policies discriminate against other racial and ethnic groups, particularly Asian Americans and white Americans, by favoring one group over another based on racial preference rather than individual achievement. The argument is that merit should be the sole criterion for selection, and that any deviation from this principle undermines the integrity of institutions. Opponents also suggest that the diversity of current American society proves that affirmative action has succeeded and is no longer required. They point to the success stories of minority leaders as evidence that the system works without special preferences.
There is a more nuanced criticism, however, often voiced by scholars of color themselves. Some argue that affirmative action tends to benefit the most privileged within minority groups—the children of wealthy Black or Latino families—at the expense of the least fortunate within the majority groups. They suggest that the policy has become a tool for the middle class of all races to advance, rather than a mechanism for the truly disadvantaged. Others, particularly in higher education, worry that affirmative action can inadvertently harm the very students it aims to help by placing them in academic environments where they are underprepared, leading to high dropout rates and a sense of isolation.
The human cost of this debate is measured in the lives of students denied entry to the universities of their dreams, and in the professionals who never get the chance to practice their craft. It is measured in the frustration of a white family watching their child be rejected in favor of a minority applicant, and in the despair of a Black student who knows that, without a specific policy, their application might never even be read. It is a debate in which every side feels like a victim, and every side feels justified in its anger.
The Future in a Post-Affirmative Action Era
As we move forward from the 2023 Supreme Court decision, the United States faces a profound uncertainty. The rhetoric of "affirmative action" has been replaced by "diversity, equity, and inclusion," but the legal tools to achieve these goals have been severely curtailed. Universities are now forced to innovate, exploring new ways to recruit and retain a diverse student body without explicitly considering race. Some are turning to socioeconomic-based admissions, hoping that class correlates with race. Others are expanding outreach programs and partnerships with underfunded high schools.
But can these race-neutral methods achieve the same results? Early evidence from states that banned affirmative action years ago suggests they cannot. The diversity of flagship universities in California and Michigan stagnated or declined in the absence of race-conscious admissions. The promise of a representative public institution remains elusive.
The legacy of affirmative action is complex and contradictory. It was born of a noble ideal: to repair the deep wounds of a nation torn apart by slavery and segregation. It has succeeded in creating a more diverse workforce and a more representative leadership class. But it has also fueled a backlash that has deepened racial divisions and led to a legal environment where the very concept of racial redress is under siege.
The story of affirmative action is the story of America itself—a story of aspiration and failure, of progress and backlash. It is a story of a nation struggling to live up to its founding promises while grappling with the weight of its history. As the courts close the door on race-based admissions, the question remains: can a society that has spent a century trying to fix its racial inequalities now find a way to do it without using race at all? Or will the removal of these policies simply reveal the depth of the chasm that affirmative action was trying to bridge?
The answer is not yet written. But the stakes could not be higher. The future of American democracy depends on our ability to create a society where every individual, regardless of their background, has a fair chance to succeed. Whether that requires affirmative action, or something entirely new, remains the great unanswered question of our time. The history of forty acres and a mule, of the Revised Philadelphia Plan, of the Supreme Court's shifting precedents, all point to one conclusion: the work of justice is never done. It is a continuous struggle, a constant reimagining of what it means to be a nation of equals. And as long as that struggle continues, the debate over affirmative action will remain at the heart of the American experience.