We’re now 2+ years into the tech downturn. Companies, with or without AI as a driver, continue to announce major layoffs, seemingly every month or two. No one advocates for teaching everyone to code anymore.
While it was a relatively short-lived phenomenon, the push for the democratization of coding¹ is a microcosm of the larger, decades-long push to democratize higher education. The intentions are correspondingly noble: to provide equal opportunity for forthcoming generations to participate in higher-leverage work, with proven higher salaries and lifetime earnings. The cost, however, in student loans and coding bootcamp tuition debt raises questions about how well these intentions were ultimately realized, and the extent to which bad actors took advantage of the situation.
But costs aside, the other pertinent question is whether college is imparting the education it's chartered to provide. Lowering college acceptance criteria, or making classes more accessible to more students, may indeed bolster enrollment numbers, but it doesn't magically make each additional student more prepared for college, or more engaged in curricula that assume an equal amount of prior engagement in high school. The hand-wringing over how AI is used to cheat in college is a bit of a trailing indicator; it speaks more to how the skillset colleges are looking to impart has already been superseded by technology, and bears less relevance to the realities of the modern workplace.
It also raises a related, yet fundamental question: do students even want to participate? This article from a college professor made the rounds a couple of months back, observing that the students they teach nowadays don't care to read, write, do math, attend class, take notes, or…uh, stay seated. It paints a pretty bleak picture of what an educated workforce will look like for future generations.
A Bachelor’s degree will both cost more and be worth less than its generational predecessor.
Which gets to the title of this post. Policies and programs designed to enable more access, without accompanying means of catering to each individual's circumstances, end up just diluting standards. Coding bootcamp grads found out, after their 9-month-long classes and projects, that a major employability gap remained between them and college grads who spent 4+ years in school between studies and internships. Students who aren't reading or writing at a college level won't be able to compete for jobs that demand higher levels of fluency and proficiency.
Closer to home, the San Francisco Unified School District infamously pushed back Algebra I instruction a decade ago, from 8th grade in middle school to 9th grade in high school. The school board decided to delay these critical math classes for a year in a misguided attempt to promote equity; the extra year was intended to give lagging students more time to catch up. But, facing both public backlash and a lack of results from this experiment, the district was forced to reverse that unpopular decision a decade later, punctuated by an overwhelming voter mandate.
Instead of pulling up those who may have lacked the opportunity, these efforts only lower the bar to entry. In fact, success in broadening participation, without doing the hard work of leveling up individuals to maintain the standard, devalues the very credentials that made the effort worthwhile in the first place. And absent scalable ways to elevate the lower cohorts to meet that high bar, we fall back to our de facto mechanism of exclusivity: stack ranking. It's more reliable to focus on identifying the top x%, whether for university admissions, entry-level jobs, or professional occupations, than to rely on a questionable minimum bar of admission. A system that should have expanded the pie is instead rendered back into a zero-sum competition.
¹ I suppose the updated parallel in 2025 is vibe coding, where you don't even have to learn to code so much as figure out how to prompt.