In 1983, the National Commission on Excellence in Education issued a report, “A Nation at Risk,” which claimed that U.S. schools were failing. In response, schools increased their focus on STEM classes, standardized testing, and enrolling students in college. As a direct result, in 1994, 75% of high school seniors in the U.S. enrolled in a postsecondary institution. The origin of the college-for-all mentality is widely attributed to the 1983 report.

“A Nation at Risk” led to a decline in support for minority students in education due to tracking. “Tracking” is when students of lower socioeconomic status, or students with limited English proficiency, are encouraged to enter fields with limited upward mobility. Tracking was especially present in vocational programs that prepared non-college-bound students for a career after high school.

In the 1990s, traditional vocational education programs received pushback because of the tracking that was taking place. Critics argued that these programs kept poor students in the cycle of poverty.

Because of the backlash against tracking, schools and society overcorrected in 2001 with No Child Left Behind and put everyone on the college-prep track. This led to degree inflation, the rising demand for bachelor's degrees for jobs that do not actually require one.

Today, entry-level jobs require postsecondary degrees even when the work does not actually call for that education. Only 24% of students who graduate college get a job that requires a college degree, and yet on any job search engine, nearly all entry-level postings will require one.

Does college really work for all? The short answer is no.

The dropout rate for undergraduate students is 40%, with 30% of first-year college students dropping out before their second year. There are around 19.6 million college students in the United States. Applying that 40% rate, roughly 7.84 million of today's students will spend time and money at colleges and universities only to find it is not for them. If the college-for-all mentality continues, more students will be forced to attend college, and the number of students negatively impacted will grow.

An in-state freshman at the University of Texas at Austin who drops out before his second year will have wasted a year of his life and almost $30,000 on tuition, textbooks, room and board, and other expenses. Students who drop out have missed a year of work experience, hold no postsecondary degree, and are probably thousands of dollars in debt.

In 2019, 69% of college students in the U.S. took out loans and graduated with an average of $29,900 in debt, but at least they had a college degree to show for it. What do students who drop out have to show for their time, other than a financial situation worse than if they had never gone to college in the first place?

My family is not immune to the negative side effects of college-for-all. In the early 1990s, my parents went to college for the first time. Neither of them was emotionally or mentally ready, so they both dropped out. They met one another, got married, had me, and then decided to go back to school. I even walked across the stage with my dad.

While their story ended happily, the first few years after dropping out were not easy for them. They had loans to pay off, jobs to find, and skills to make up for, and that was the '90s. The college-for-all mentality becomes more pervasive in our society with every year that passes.

Historically, college was a choice, not a requirement for a high-paying, rewarding job. College-for-all should mean that anyone who wants to go to college is able to do so, not that everyone has to go or should feel pressured to. No one should be kept from success by a social mandate to obtain postsecondary schooling. No two people are the same, so their learning journeys should not have to be, either.