Java and beginner programming courses at universities


I’ve been interested in programming since the beginning of high school, and I have freelanced in web design and development over the past few years. This fall I started university, majoring in software engineering. Of course we were going to learn to program, but at first it was a huge turnoff for me: we were doing it in Java. Why?

Disclaimer: I’m by no means an experienced software engineer or programmer. I focus on front-end web development, and these are just my opinions on learning to program from a student’s perspective.

How do you really learn to program? Not just a specific language, but the general idea: the logic, the ability to think in abstractions and algorithms, to see the flow of a program? There are freshmen in software engineering and computer science programs who have never programmed before, and it’s no simple or quick task to teach them the idioms and thinking behind programming. What is the most efficient, student-friendly, and pedagogically sound way of teaching how to think in code?

As mentioned, I started university this fall. I considered myself lucky, since I had programmed before: I had taken a couple of courses in high school and afterwards worked with web technologies as a freelancer, so I had been exposed to both the academic and the practical side of programming.

The three courses I took in high school taught us Java. We were originally supposed to study C++ in the first two courses and take Java last, but the school decided to go with Java for all three, to give us a firm grasp of programming and general OO principles. Java was thus the first real programming language I actively used. During my senior year I began studying other languages lightly: PHP seemed like heaven with dynamic typing in all its glory, and JavaScript was a nice buddy I could see myself hanging out with. Later, Ruby and Python lured me in (though to this day I haven’t begun learning them seriously).

I had heard from other students that Java was the language taught in the university program I was about to apply for. My heart sank a bit, since I had begun to grasp code minimalism, “beautiful code”, and DRY, and in my view Java doesn’t really embrace those teachings.

I had a couple of theories about why, oh why, the university chose Java to teach new students:

  • It’s cross-platform. Students run OS X, Linux, or Windows, and you can get Java up and running effortlessly on these platforms.
  • It’s object oriented. The two programming courses we took this past fall were actually titled Object Oriented Development (beginner and intermediate). Java was built with many solid object-oriented ideas in mind: inheritance, interfaces, polymorphism, dynamic binding, etc. If you have a solid understanding of OO in Java, there’s a good chance you’ll grasp any other OO language in a very short time (see the sketch after this list).
  • It’s widely used in the enterprise sector (or my “Who’s waiting for the fresh students after they graduate?” theory).
  • It’s quite versatile. The university can use Java to teach full-blown desktop applications, database programming, GUI design, web server programming, and, in recent years, mobile app development (Android).
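
To make those OO terms concrete, here is a minimal Java sketch (the Shape, Circle, and Square names are my own, purely for illustration) showing an interface, polymorphism, and dynamic binding in one place:

```java
// An interface defines a contract that implementing classes must fulfill.
interface Shape {
    double area();
}

class Circle implements Shape {
    private final double radius;

    Circle(double radius) {
        this.radius = radius;
    }

    @Override
    public double area() {          // polymorphism: each Shape answers differently
        return Math.PI * radius * radius;
    }
}

class Square implements Shape {
    private final double side;

    Square(double side) {
        this.side = side;
    }

    @Override
    public double area() {
        return side * side;
    }
}

public class ShapeDemo {
    public static void main(String[] args) {
        // Dynamic binding: the variable is typed as Shape, but the method
        // that actually runs is chosen by the object's real class at runtime.
        Shape[] shapes = { new Circle(1.0), new Square(2.0) };
        for (Shape s : shapes) {
            System.out.println(s.area());
        }
    }
}
```

Once a student understands why the loop can treat a Circle and a Square the same way, the equivalent constructs in C#, Python, or Ruby tend to click quickly.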

But what about the cons of teaching Java to new students?

  • Its bloated syntax might be an obstacle when learning how to program. Why should a beginner care about curly braces? Why should he or she even have to ponder what static is? As soon as the teacher says “Don’t worry about static, we’ll get to that in course X” and leaves the students dumbfounded, something rotten is going on.
  • The lack of simplicity and the need for boilerplate code. Tasks you pull off in a single line of Ruby can take up several lines in Java (see the sketch after this list). The question is: should students be forced to write a lot of code themselves, or should the language provide convenience methods for even the simplest tasks? Or both?
  • Overly complicated and sometimes utterly useless design patterns. AbstractModelFactory, I’m looking at you.
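
As a rough illustration of the boilerplate point, here is a sketch of a program that reads one line from the console and echoes it back. In Ruby this is roughly the one-liner puts gets; the Java version needs an import, a class, a main method, and a Scanner (the Echo class name is just an example):

```java
import java.util.Scanner;

public class Echo {
    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);   // wrap standard input
        String line = in.nextLine();           // read one line of text
        System.out.println(line);              // print it back out
    }
}
```

None of it is hard, but every keyword here (public, static, void, the class itself) is something a first-week student has to type on faith.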

I think it all boils down to this: is the potential pain and frustration that come with Java worth it for a solid foundation in object orientation, scanning through documentation, diving deep into Swing, and worrying about semicolons? Perhaps.

What’s the focus: learning to program in general, or learning a single language? Sometimes I think my university focuses a bit too much on Java. They should branch out and bring in examples from other languages once in a while. In the end, I think universities often use Java because it’s the shotgun solution: you can squeeze a fairly large curriculum into it, it’s widely used in industry, and it teaches the basic OO concepts.

If it were up to me, I’d teach Ruby to new students, just to get the hang of basic program flow, printing to and reading from the console (I/O), calculations, and string manipulation, and then move on to methods, classes, objects, et al. It would only serve as an intro tool to programming. Sure, you can’t create an Android app with Ruby, but I think it would be a godsend for the completely new students. The downside? These students would have to learn Java later on instead.

One could debate whether it’s good or bad for new students to learn two programming languages in a short period of time. But when I see students struggling to understand why they can’t call methods on an int in Java the way they can on a String, and similar things, I wonder whether Java is the right choice to start out with.
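
For the curious, here is a small sketch of that exact stumbling block (the variable names are my own): a String is an object and has methods, while an int is a primitive and has none, so you have to go through the Integer wrapper class instead.

```java
public class PrimitiveDemo {
    public static void main(String[] args) {
        String name = "Ada";
        System.out.println(name.length());        // fine: String is an object with methods

        int count = 42;
        // System.out.println(count.toString());  // does not compile: int is a primitive

        // You have to go through the Integer wrapper class (or autoboxing) instead:
        System.out.println(Integer.toString(count));
        Integer boxed = count;                     // autoboxing into the wrapper object
        System.out.println(boxed.toString());
    }
}
```

A distinction like this is second nature after a year of Java, but it is exactly the kind of detail that has nothing to do with learning how to think in code.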