Grumpy Old Man link

While responding to a blog comment about how Java programming these days is a lot like C programming in the 1980s (everything is an OS to these programmers, etc.).. I ended up with a lengthy diatribe.. which I figured was good enough for a blog post:

[It has always been the case that each generation of programmers has some set of people who think everything is a "web app", an "OS", an "AI program", etc.] This happens with pretty much every language once it grows to a certain size. Everything was a Fortran or Snobol program in the 1970s. Everything was a LISP program in the 1980s. Everything was a C program in the early 1990s and then C++ in the late 1990s. Now everything is either C# or Java.

Industry wants to standardize on a small set of languages… mainly because it's hard to figure out how to manage people who are writing a project in Haskell, Perl, LISP, C, and Python… How do you count defect rates? Which people get paid what wage? Are things efficient enough for our budgets and deadlines? Will competitors be able to take our code and steal it? Etc., etc.

Industry then chooses some language for X years and tells colleges, "we want more people who know this." Colleges then gear their programs toward it, since that is where the grant money is, etc.

The big issue, though, is how colleges teach these courses. At a good many places I have been.. you have a set of professors who have taught "how to write a Shell sort" in whatever language is around.. and they teach students a bunch of stuff that they aren't going to do as efficiently as the provided libraries. Or the professors play catch-up, turning Java into Snobol in their heads and using whatever shortcuts work ("it's all in the standard library", which is what I remember from a C++ course in the 1990s).

Very few colleges try to teach the budding engineer/scientist how to think; it is sort of expected that you know this already ("Just use your common sense" was a reply I heard in the 1980s to the question of how to solve a problem). The things CS courses should teach are the same as in other engineering and science courses: how to think like a computer, how to break a problem down so the computer can attack it, what the first principles of the issue are, what tools you can use to solve it, when you need to try something better and when you should stick with existing practices, and how to QA your own or someone else's code. And then the final part: how to interact with the customer (more MBA-ish, but needed more today than 30 years ago).

These are things that I have seen lacking, over and over again, in people coming out of college with CS or CE degrees. Instead you have to spend 3-4 years breaking people in before they are the equivalent of graduates from other science or engineering programs.

[Bitter old man comments free of charge and under a CC attribution-required license]