Should university teach current or future technologies?

jordan M.R on November 16, 2019

So at my university we are being taught technologies like PHP and jQuery for our web development module. However, if current trends continue these ...
 
Santhosh Kumar • Edited

Maybe, but they should focus more on data structures, algorithms, compiler design, theory of computation, OOP, and mathematics, alongside more current practical training.

Technologies become obsolete in 5 or 6 years, but basic knowledge doesn't, so universities should teach the things that will help students in the long run.
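
For a concrete picture of the kind of basic knowledge that doesn't expire, here's a sketch of binary search in plain JavaScript (the function name is just for this illustration); the algorithm predates every current framework and will outlast them too:

```javascript
// Binary search: find a value in a sorted array in O(log n) steps.
// The technique is decades old and independent of any framework.
function binarySearch(sorted, target) {
  var lo = 0;
  var hi = sorted.length - 1;
  while (lo <= hi) {
    var mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found it
    if (sorted[mid] < target) lo = mid + 1; // discard the left half
    else hi = mid - 1;                      // discard the right half
  }
  return -1; // not present
}

console.log(binarySearch([1, 3, 5, 8, 13], 8)); // 3
console.log(binarySearch([1, 3, 5, 8, 13], 4)); // -1
```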

Younes

I think they should teach concepts and how to learn by yourself, because everything changes so fast that you need to be able to learn without a teacher, whether from a book or by doing research on the internet, for example.

A lot of people get their degrees, but when the technology they learned becomes obsolete, they can't do anything else. That is sad, because right now we need a lot of people to build our unknown future.

Louis • Edited

I personally don't think so. They should teach you the basic concepts of computer science, maths, and programming. It's easy to learn React just by reading its docs, and you can pick up practically any framework or language in a few days if you know how to code in general.

Computational thinking and problem solving, however, are often left behind in those docs and tutorials. So university should teach you exactly that: a solid framework of knowledge and understanding on which you can build anything you want!

jordan M.R

I think so. It's important to spend a little bit of time exploring the most popular tech used right now. But I feel a lot of grads will find, once they leave uni, that they need to do quite a lot of catching up, as things like Docker, JS frameworks, and SCSS aren't even given a footnote. A lot of people in my year don't even know what "responsive websites" means.

jordan M.R

WOW, thank you everybody who contributed.
I did not expect so much love on my first post. ♥♡♥

I agree with all of what you guys are saying. A strong and clearly understood foundation in core concepts like algorithms, data structures, and more is far more important than which technologies you can write down on your CV.

I guess in the end it really doesn't matter what technology you use, but how you use it.

Thomas H Jones II

You're not citing "future" technologies.

That said, it's a bit difficult to project which truly "future" technologies will gain meaningful traction within even a five-year horizon. Even if you're correct in picking the eventual winners, it's likely those winners will have changed non-trivially by the time there's strong demand for them.

Overall, it's far more critical to be taught "fundamentals" and other portable skills than to be concerned with a particular implementation of those skills. When you know what underpins everything, picking up the overlays is fairly trivial.

jordan M.R

True, they ain't future technologies.
But... I felt it was a lot better than calling PHP and jQuery old, outdated, or fallen out of popularity.
Plus everyone has to add a little bit of clickbait to their titles 😉.

But I agree with you. Nobody really knows what tech we will be using in the next few years and it's better to focus on what won't change, the fundamentals.

Idan Arye

What my university did is teach us how to learn. We had a Python course (well, more like a subcourse - it was embedded in a bigger course that taught basic programming with C/C++) where they didn't actually teach Python, but instead had us follow an online tutorial and do assignments. So, more than a Python course, this was a course about how to learn new technology.

Now, Python is hardly a "future technology", but the same concept can still be used - teach the students how to learn a new technology, and they will be able to use that skill to learn actual future technologies once they need to.

Carl-Erik Kopseng • Edited

It's not like this is a new idea; I was thinking along the same lines when I was studying 15 years ago. But you're starting at this from the wrong end: it's not about which technology to choose, but why you're taught technology to begin with.

A university is different from a vocational school or a coding bootcamp. Its first and foremost mission is not to make you a "PHP coder" or a "Java code monkey", but to train you in computer science so that your skill set outlasts the latest tech fad. Which technology is used to achieve that is somewhat beside the point. With that mindset, a tutor will with good reason choose unconventional languages such as Scheme, Prolog, OCaml, and Haskell to best teach the theme of the current class. This bit is HARD and requires THINKING, something you rarely have time for after you start working, when you mostly browse through some "Introduction to Redux" or whatever the latest tech might be.

And even if you don't end up with something immediately marketable, everyone interviewing a junior straight out of college or university knows they will need some training anyway. That is not a big issue.

If you do choose to look into some tech, focus on the fundamentals there as well. I have let several juniors go in interview situations because they knew jQuery, but not the prototypal nature of JavaScript or basic vanilla ECMAScript 5.1.
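
As a minimal sketch of the fundamentals being referred to here, this is plain ES5 prototypal inheritance with no jQuery involved (the Animal/Dog names are made up purely for illustration):

```javascript
// In ES5 there are no classes: objects delegate to a prototype.
function Animal(name) {
  this.name = name;
}

// Methods live on the prototype and are shared by every instance.
Animal.prototype.speak = function () {
  return this.name + ' makes a sound';
};

function Dog(name) {
  Animal.call(this, name); // reuse the parent constructor
}

// Wire up the chain: Dog.prototype delegates to Animal.prototype.
Dog.prototype = Object.create(Animal.prototype);
Dog.prototype.constructor = Dog;

Dog.prototype.speak = function () {
  return this.name + ' barks';
};

var rex = new Dog('Rex');
console.log(rex.speak());           // "Rex barks"
console.log(rex instanceof Animal); // true, resolved via the prototype chain
```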

Vincent Grovestine • Edited

I'll echo what others have already said: university courses are meant to deliver the fundamentals, leaving students with a solid foundation to build upon in the future.

As someone who has worked in higher education for the last decade, another tidbit worth bearing in mind is that, in my experience at least, curriculum development happens an order of magnitude slower than "the real world". Rather than choose a cutting-edge stack, educators will often opt for established technologies to demonstrate theory and principles.

Course redevelopment is time-consuming! Given the ever-changing state of today's tech, faculty could find themselves overhauling their teaching materials every. single. semester. just to keep up with modern trends. And it isn't only the core material that comes into play; faculty must also be competent with the stack/tools they choose to teach.

Donning my grumpy old guy hat for a moment...

When I came through the university system, procedural ANSI C was the core language taught at my school. I learned the usual assortment of data structures, flow control, sorting, recursion, algorithmic analysis, etc., and still draw upon those solid fundamentals today, some *cough* 25 years *cough* later. My officemate is younger than me; his CS undergrad was grounded in Java. And the current cohort of CS students at my institution are using Python for foundational work.

Modern...? Not always. Timeless... For sure. :)

Chad Adams • Edited

Yes, we learned AngularJS in college. By the time I graduated, it was dead.