Programming: What Makes a Good First Language?

My adventures in programming began in the Fall of 2011 when I took Nancy McCracken’s IST 256: Application Programming for Information Systems course. I was almost instantaneously hooked. I had taken only a handful of School of Information Studies courses at that point and had yet to find a concentration area that truly clicked. I liked networks, but I didn’t love them. I liked systems administration, but I still longed for the feeling that I had built something on my own—without Active Directory’s help. I began to believe that programming could be my niche after finishing my first project and not only enjoying it, but wanting more.

As the semester went on, I continued to love the feeling that I had created something, but I began to grow frustrated with the programming language in which the class was taught. Java is a wonderful, robust and sometimes trying programming language. I had no experience with a different language, but I couldn’t help but feel that there must be an easier way. I spent too many nights wondering ‘what on earth is a buffered reader?’

Java was soon replaced in my heart with a new programming language.

Fast forward to this past Spring semester: I had accepted a position as the Teaching Assistant for the iSchool’s Java Applications Programming course, as well as an intern role this Summer at Rounded Co. in the Syracuse Tech Garden. I went to a few orientation sessions at Rounded Co. and was introduced to the language that would change everything for me: Ruby.

I began learning under Brian Weinreich’s instruction, and he recommended sites that would be useful in my endeavors. Thanks to resources like Rails for Zombies, I quickly gained traction and learned to do things in Ruby in 5 lines of code that used to take me 15, maybe 20 lines in Java. I was completely blown away by how much simpler Ruby’s syntax was than Java’s.
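To give a made-up flavor of the difference (this isn’t from the course, just the sort of task I mean), here’s counting word frequencies in Ruby. My Java versions of things like this always needed imports, a HashMap, and a loop’s worth of ceremony:

```ruby
# Count how often each word appears in a string and report the
# most common one. Hash.new(0) gives every missing key a default
# count of zero, so no existence checks are needed.
text = "the cat and the dog saw the bird"
counts = Hash.new(0)
text.split.each { |word| counts[word] += 1 }
word, n = counts.max_by { |_, count| count }
puts "#{word} appears #{n} times"   # => the appears 3 times
```

Five short lines of actual work, and every one of them reads almost like English.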

Well then, why aren’t all classes taught in Ruby?

Although I will admit that I haven’t touched Java since I wrapped up my responsibilities as a TA, I am not here to tell you to skip out on Java or C in favor of ‘easier’, startup-worshipped languages such as Ruby or Python. Unless you’re looking to learn a language based strictly on how cool the name sounds. In that case, go with Python every time.

However, if that’s not the case, I would suggest not overlooking Java or C. They’re complex languages, and they cover a large number of features that are present in other languages. I think that I appreciate – and am so enamored with – Ruby because I came to it already knowing the basics. Not only that, but I appreciated the simplicity and cleanliness of the Ruby syntax. I believe that any first programming language is going to be a hard one. I was fortunate enough to have the opportunity to learn mine in a college environment where I had someone making sure that I wasn’t falling behind—and most importantly—that I was doing the work.

I tend to be an independent learner. This is a trait I think is shared widely throughout the tech community. But, no matter how little structure you feel you need to learn, I would still recommend learning your first programming language in a university, community college, or otherwise ‘in-person’ environment if you can. Being able to raise your hand and ask why your code won’t run is infinitely easier than searching through countless forums online. Additionally, colleges tend to teach languages like C and Java. They may be more difficult, but you’ll benefit in the long run from learning the basics with them. Not only because many job opportunities will look highly upon you knowing them, but because they will make it easier to broaden your horizons once you decide to move on to your second language, or your third, or your fourth….

Now that you’ve heard about my experiences as an amateur programmer, I’d love to hear about yours. Did you have success with a different approach? Reply in the comments or tweet at @samiiruddy; I’d like to continue the conversation!

Samii Ruddy

Samii Ruddy is an undergraduate student at the Syracuse University School of Information Studies. When not obsessing over technology, she can be found reading about politics, collecting comics, or attempting to longboard around campus. You can reach her at @samiiruddy on Twitter.


  • Ancurio

    My first language, when I started learning as a 12-year-old, was actually Ruby, and to this day it has a special place in my heart =) No, seriously: looking back, especially considering that I taught myself everything with the help of a book, I certainly appreciate the fact that Ruby is a high-level interpreted language, because it hides many things that aren’t essential to the basic concepts of programming, and it truly does have a beautiful syntax.

    Java was also the language of choice in my high school classes, and as a somewhat more experienced programmer who could watch the effect it had on the first-timers around me, I have to say that teaching OO programming from the beginning is not a good thing. Why? Because to really appreciate it and fully grasp its modern approach to modelling problems, you HAVE to have used some sort of procedural coding first. Ruby is really cool in that respect because it lets you completely ignore classes while writing your first programs. That is what I did too, and only after dealing with OO in school through Java did I really notice that Ruby offers similar capabilities.

    About your question: I have to disagree. From my experience as the guy in my high school computer class whom everyone ran to when their Java code wouldn’t compile, I think many people, even if you tell them exactly what’s wrong with their code, will have a hard time truly grasping it. No matter how often I explained to the same people what a NullPointerException meant and how to track down its causes, they called me again and again with the same problems. I think everyone should go through the trouble of seriously debugging their own code at the beginning (even if that means placing hundreds of print statements everywhere), because it is a valuable experience.
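    A toy sketch of what ignoring classes looks like in Ruby: just top-level methods and variables, no OO in sight (the temperature conversion here is an arbitrary example, not from any course).

```ruby
# A classless first program: define a plain method and call it.
# No class definitions, constructors, or access modifiers needed.
def fahrenheit_to_celsius(f)
  (f - 32) * 5.0 / 9.0
end

temps_f = [32, 212, 98.6]
temps_c = temps_f.map { |f| fahrenheit_to_celsius(f) }
puts temps_c.inspect
```

    Compare that with Java, where even “Hello, World” forces a beginner to type `public class`, `static`, and `void` before they understand any of those words.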

    • I definitely have to agree about debugging your own code being a great learning tool. Nothing helps me remember what to do next time like figuring it out on my own. It’s always been how I learn; I need things to be hands-on in order to remember. When I just can’t seem to get it on my own, it’s nice to have someone there to help, though. But then I usually have to do it on my own a few times so that it isn’t just a quick fix I forget about later.