My path toward a career in software development began at age 11, when I got my first computer. At the time, a career in computer programming was a little like being a nuclear physicist: such jobs existed, but they were rare. By the time I reached higher education, technical colleges were offering information systems certificates.
During my high school years I developed my programming skills by writing messaging software for the school's network of BBC computers. Later I wrote bulletin board software, and a friend wired up my 300 baud modem so it could answer automatically. All of this was before I attended a technical college.
In my view the brains of teenagers are highly plastic. The ideal time to start teaching programming is early, at about ten. Make the tools available, and don't patronize children with simplistic tools that assume they cannot understand 'adult' languages. At eleven years old I was perfectly capable of understanding languages such as BASIC, C and even assembler.
Academically we have put software development in a box, treating it the same as other professions. But software development is a broad skill used across many fields; it is a force multiplier. I therefore think we need to give all kids access to learning materials and a path to becoming a professional developer. I see this occurring in three phases.
The first phase, from about ten until leaving school, should involve learning how to program. Schools should support a voluntary curriculum that gives students the opportunity to learn. Ideally we would also have software clubs, similar to the code clubs that already exist outside schools.
The second phase, for those wishing to become professionals, should be a technical institute where students are introduced to aspects of software development they would not be familiar with from their school days. The main difference here is learning how to work in a team towards a single project, with training in the disciplines of software development: for me, that would be Agile methodology and Scrum. This phase would last no more than a year.
The third and final phase would be a paid apprenticeship of one or two years, in which the new developer works on real projects under the tutelage of experienced senior developers. At the end of this you have a competent intermediate developer who can be trusted to cut code and work within a team.
In my view, a four-year university degree does not in fact prepare you to be a professional software developer. At best it gives you a ticket into your first job, where you will begin learning the ropes; it will take a further couple of years to become an intermediate developer.
The university track therefore takes five to six years to reach intermediate level, while the technical college track takes two or three. You would also be earning after only a year on the technical college track, whereas on the university track you cannot earn until after four years. Then there is the significant saving on educational costs.
There is of course more to university than getting qualified, so if you can afford a broader and deeper course in computing it may well be of benefit. But if your aim is to make a living and enter the workforce as quickly as possible, it is a questionable use of time.
As an employer, formal qualifications now mean very little to me. They might get you through the door, but so will practical examples and demonstrations of genuine competence. Am I a typical employer? Perhaps not. Large corporations may be less flexible with their job requirements, but smaller outfits tend to be more energetic and to care more about competence than qualifications. Just ask Bill Gates.
So what is your view? Am I totally off base? If you are an employer, what is your view on four-year degrees?