
Discussion on: Should university teach current or future technologies?

Vincent Grovestine • Edited

I'll echo what others have already said: University courses are meant to deliver the fundamentals, leaving students with a solid foundation to build on in the future.

As someone who has worked in higher education for the last decade, another tidbit worth bearing in mind is that--in my experience, at least--curriculum development moves an order of magnitude slower than "the real world". Rather than choose a cutting-edge stack, educators will often opt for established technologies to demonstrate theory and principles.

Course redevelopment is time-consuming! Given the ever-changing state of today's tech, faculty could find themselves overhauling their teaching materials every. single. semester. just to keep up with modern trends. And it isn't only the core material that comes into play; faculty must also be competent with the stack/tools they choose to teach.

Donning my grumpy old guy hat for a moment...

When I came through the university system, procedural ANSI C was the core language taught at my school. I learned the usual assortment of data structures, flow control, sorting, recursion, algorithmic analysis, etc., and I still draw on those solid fundamentals today, some cough 25 years cough later. My officemate is younger than me; his CS undergrad was grounded in Java. And the current cohort of CS students at my institution is using Python for foundational work.
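
To make "fundamentals" concrete, here's a quick sketch (illustrative only, not from any actual coursework) of a recursive binary search in C; the same idea translates almost line-for-line into Java or Python, which is exactly why it keeps getting taught:

```c
#include <stdio.h>

/* Recursive binary search: returns the index of target in the sorted
 * array a[lo..hi], or -1 if it isn't present. The algorithm is the
 * timeless part; only the syntax changes between languages. */
int bsearch_rec(const int a[], int lo, int hi, int target)
{
    if (lo > hi)
        return -1;                    /* base case: empty range */

    int mid = lo + (hi - lo) / 2;     /* avoids overflow of (lo + hi) */

    if (a[mid] == target)
        return mid;
    else if (a[mid] < target)
        return bsearch_rec(a, mid + 1, hi, target);
    else
        return bsearch_rec(a, lo, mid - 1, target);
}

int main(void)
{
    int data[] = {2, 3, 5, 8, 13, 21, 34};
    int n = sizeof data / sizeof data[0];

    printf("index of 13: %d\n", bsearch_rec(data, 0, n - 1, 13)); /* 4 */
    printf("index of 7:  %d\n", bsearch_rec(data, 0, n - 1, 7));  /* -1 */
    return 0;
}
```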

Modern...? Not always. Timeless... For sure. :)