This tweet came across my Twitter timeline over the weekend:
Stuff CS departments should teach, but don’t do very well:
* Source control, esp. git
* How to do effective code review
* How to read API documentation
* Publishing your own library
* Command-line basics
* Debugging and fixing for the long term
* Refactoring w/unit tests
— Jimmy Song (송재준) (@jimmysong) April 21, 2018
The tweet drew a number of responses from senior leads and managers (many of whom I respect), who piled on additional skills that, in aggregate, amount to a list of engineering best practices. I understand the good intentions and what folks are going for.
At the same time, though, I couldn't help but notice the problematic framing: that the responsibility of training and developing software engineers falls on the shoulders of academic institutions and CS degree programs. There are two major issues:
- It's wholly unrealistic to expect a 4-year degree program to cover this amount of material.
- It absolves companies, teams, and managers of the responsibility to train and grow their software engineers.
Let’s walk through these individually.
How Much Can You Teach?
Most universities and CS programs will claim that they are focused on computer science, the branch of computing theory and knowledge which powers information technology but is a degree removed from applicability and engineering. The trend in recent years, though, is that CS programs have revised their curricula to add more practical software engineering sections, no doubt in part due to the donations and monetary support from influential tech companies.
There is a reasonable proxy, set by companies themselves: new grad interview processes. While interviews for senior hires are more controversial and often vary by interviewer, the standard for determining the technical competence of newly-minted CS grads rests on testing for a set of "CS fundamentals," which is really code for the ability to work with data structures and algorithms. Much of the interview-prep industry that has sprung up around landing jobs at Google or Facebook comes down to practicing this set of problems.
It’s telling that over the course of 4+ years of a CS bachelor’s degree, companies can really only nail down about two semesters’ worth of coursework to interview their candidates with1. There are plenty of classes which cover wider aspects of software engineering (e.g., UI/UX design, coding practices, etc.), but none really matter in the interview process and thus are a minor factor in getting a job out of school.
To me, this is a clear sign that companies can’t really rely on schools to provide this additional software development context. Internships and co-ops fill a bit of the gap2, but even then the depth of software engineering—as a discipline with its own tools and norms—is hard to instill in such short amounts of time. In particular, it’s hard to see how and why source control, refactoring, project management, and other facets of development matter if projects don’t survive long enough to demonstrate their utility.
All of this is a long-winded way to say that the precursors to entry-level full-time employment—schools, internships, even coding academies—only touch on a small part of what it takes to develop software professionally, and a limited-time academic environment provides, at best, a pale facsimile.
The Bygone Apprenticeship
The other factor here is that the modern employment framework places the onus of skills improvement on the worker and not the firm. The combination of:
- Reduced employment benefits, including the lack of unionization and pensions, leading to
- Much reduced expectations of staying at one company for more than a handful of years, which means
- Companies are less inclined to invest in training their workers for the long-haul
really means that employers expect their hires to hit the ground running. Anecdotally, I've seen this play out over a decade of hiring: while offers for new grads have definitely gone up in that time frame, the amount of extracurricular activity has also shot up. New grads bring with them multiple major side projects, extensive GitHub repos, annual internships, and other examples of software development beyond raw academic requirements3.
And if companies can effectively set extensive preconditions for entry-level employment (which, in aggregate, sound closer to what I'd expect from a seasoned, senior software engineer than from a new CS grad), then they correspondingly don't have to make the same investment in their own workers. Apprenticeships, which are just a formalization of entry-level on-the-job training, are no longer supported outside of a few pockets of manufacturing.
It's a shame, because there is still such a gap between the demand for self-sufficient software engineers who already have this skillset and the supply of self-motivated candidates who want to fill these roles but find that there's no entity to help them bridge the chasm.
The Role of an Employer
Perhaps the silver lining here is that these expectations are shifting, albeit slowly and only for the most supply-constrained subdomains. Google’s efforts to teach AI and ML publicly are a clear example of filling a gap in education and training; it’s unlikely to lead to a rise of independent AI experts per se, but the corresponding internal training programs should make more Googlers eligible to become AI and ML engineers.
Surprisingly, not only do schools do a poor job of teaching these skills; even the biggest brand-name companies (e.g., Google, Facebook, Amazon) are hit-or-miss, in part due to the nature of their proprietary infrastructure stacks. In the long run, I hope that there will be a set of companies or training programs (beyond Waterloo's co-op-heavy CS degrees) known for teaching the set of engineering skills that Jimmy listed above.
The more damaging issue with these expectations is that this stacking of credentials favors grads who can afford to put in this “extra credit” for future employers, which perpetuates a mono-culture we’re trying to overcome in tech as a whole.↩