What You Should Learn In College, But Will (Probably) Have To Do Yourself

Final exams are over, the semester has come to a close, and all of those lucky college students now get a month of R&R during winter break. I remember those times well, and I especially remember the relief of finishing another set of challenging courses. I loved my college experience, both the mind-expanding coursework and the exciting extracurricular activities. There is no doubt in my mind that I gained an incredible amount of knowledge and understanding from my time in college, but looking back over it, I notice one gaping hole in all of the engineering and computer science coursework I completed. Almost none of the courses even came close to teaching us how to effectively manage and execute real projects.

Now, you may be saying to yourself, "Well, duh! College isn't about doing real projects. It's about learning theory and how to apply that theory to contrived textbook problems." And I would agree, to a point. There is certainly a vast quantity of knowledge surrounding software engineering, a significant portion of which a good Computer Science curriculum should teach. But I also think college students should learn how to use that knowledge in the context of actual software projects, because let's face it, most college grads are going to go work for a company where they will be contributing to projects for the rest of their careers. Only a small subset of them will go through PhD programs and cycle back into academia as professors.

Come to think of it, professors spend most of their time on projects, too. They may be research projects instead of commercial projects, but that doesn't mean they wouldn't benefit from learning how to run projects before they need to do it on the job. Not adequately teaching students how to do software development is kind of like building a car without windows or a steering wheel. The car may have lots of nice features. It could have a good, powerful engine. It might be luxuriously comfortable. But without windows you won't be able to see where you're going, and without a steering wheel you'll have no control over how to get there, anyway.

How College Misses The Mark


I can't think of any good reason why colleges couldn't teach good software development practices along with all of the theory they cover on relational databases, algorithms, networks, compilers, and everything else. Development practices are at least as important for efficiently producing high-quality software, and they're certainly not hard concepts to learn, although they are difficult to master. Maybe it's because the concepts are perceived as easy that colleges don't feel the need to teach them, but that is a terrible disservice to these future programmers.

If students were introduced to good software development practices early in their coursework, they would have a much better chance of developing good habits that would serve them well throughout their professional careers... and their college careers, for that matter. From my own experience, university programs don't spend much time, if any, going over the practical aspects of developing software. Even the courses with projects that take more than a couple of days to complete don't properly introduce good programming practices.

The courses that expect you to complete a more significant project with a group of students aren't any better, because the professors still don't give any instruction on how to manage a project within a group. It's as if they expect everyone to either already know how it should be done from high school group work or pick it up easily on the fly because it's so obvious. And let's face it, your grade depends on it, so that should be incentive enough to spontaneously learn it.

I must admit that I did have one excellent course that involved an extensive team project. It was an electrical engineering course, not a computer science course, but it was cross-listed with the computer science department; I don't know of a similar CS-only course for software development. Anyway, this course had the unassuming title of "Digital Engineering Laboratory," and it carried the most credits of any lab course, weighing in at a hefty four. Most labs were only one or two credits. The goal of this lab was to implement, from scratch, a microprocessor of the team's own design. The processor could do whatever you wanted it to, but it had to work on a real FPGA for a demo at the end of the semester.

Let me tell you, four credits was not enough for this course. I spent more time on this course than my other three courses that semester, combined. I'm sure my four teammates did the same. Yet, this course was different from all of the other project-based courses. Even though it was a lab, there was a small classroom component to the course, and the professor spent most of the time going over practical design, development, and project management principles. The rest of the classroom time was spent going over progress reports and addressing any logistical issues we came up against.

All around, it was an excellent learning experience that came closer to real-world project execution than any other college course I took, and the professor had the right idea in how the course was structured. Integrating some instruction on managing projects with the execution of a full-scale project provided enough motivation to really pay attention to the design and development practices; if you didn't, it was much harder to succeed. But there were still problems, even with this exceptional course. For one, there wasn't enough time to properly cover everything you need to know to run a project well. For another, since it was a hardware design course, there was no mention of some of the best software design and development practices.

Critical Best Practices


A few weeks ago I wrote out a list of the best software development practices that I keep in mind as I'm writing code. The practices I'm talking about now are not the same as the ones on that list. While those were rules of thumb that I picked up over time and use to guide the process of developing software in a general sense, the practices I'm talking about here are four concrete methods of developing software. They happen to be agile methods, but I'm not advocating them because they're agile. I'm advocating them because they are some of the best practical methods of developing software that I've found. I wish I had learned them in at least one of my CS courses in college, or at the very least had been made aware of their existence and encouraged to look into them on my own.

The first method is to define requirements for your software project with user stories. No matter what size project you're working on, user stories provide an exceptionally powerful way to organize the requirements of the software, and they are flexible enough to be useful in an ad-hoc way for more informal single-developer projects or to be extended to meet the more stringent requirements of enterprise or safety-critical projects. If I had known about user stories in college, I would have been using them all the time as an organizational tool, even though most of the time college project requirements are handed to you instead of you coming up with them on your own.
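
To make that concrete, here is a minimal sketch of what a few user stories might look like for a hypothetical course-scheduling app. This is my own illustration, not any particular tool; the only standard part is the canonical "As a <role>, I want <goal> so that <benefit>" template, and the Story class and example stories are invented for the sake of the example.

    from dataclasses import dataclass

    # A toy backlog of user stories for a hypothetical course-scheduling app.
    # The "As a..., I want..., so that..." template is the standard user story
    # form; everything else is just a lightweight way to keep stories organized.
    @dataclass
    class Story:
        role: str      # who wants the feature
        goal: str      # what they want to do
        benefit: str   # why it matters
        done: bool = False

        def __str__(self) -> str:
            return f"As a {self.role}, I want {self.goal} so that {self.benefit}."

    backlog = [
        Story("student", "to search courses by time slot", "I can avoid schedule conflicts"),
        Story("student", "to export my schedule to a calendar file", "I can plan around my other commitments"),
        Story("advisor", "to flag missing prerequisites", "students only register for courses they can take"),
    ]

    if __name__ == "__main__":
        for story in backlog:
            print(f"[{'done' if story.done else 'todo'}] {story}")

Even a plain text file of stories in this form does the job; the point is that every requirement names who wants it, what they want, and why.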

The other three methods are intimately related, and when used together they become an extremely powerful way to develop software. The first of them is to use version control. Always. You are using version control, right? I don't think I need to justify its use here, given the wide adoption of Git through GitHub and the fact that nearly everything written about software development over the last decade treats version control as crucial. Yet I never heard so much as a whisper about it in college. I can't believe I didn't become aware of it until a couple of years after I graduated. Don't make the same mistake. You need to learn this stuff.
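
For anyone who has never touched it, the day-to-day Git workflow is smaller than it looks: initialize a repository, stage your changes, commit them with a message, and look back through the history when you need to. The sketch below walks through exactly that. In practice you would type the git commands directly in a terminal; the Python wrapper, the temporary directory, and the notes.txt file are only there to keep the example self-contained and runnable, and it assumes git is installed and on your PATH.

    import subprocess
    import tempfile
    from pathlib import Path

    def git(*args: str, cwd: Path) -> None:
        """Run a git command in the given directory and stop if it fails."""
        subprocess.run(["git", *args], cwd=cwd, check=True)

    # Create a throwaway repository, make a change, and record it in history.
    with tempfile.TemporaryDirectory() as tmp:
        repo = Path(tmp)
        git("init", cwd=repo)                                    # start a new repository
        git("config", "user.name", "Student", cwd=repo)          # identity for the example commit
        git("config", "user.email", "student@example.com", cwd=repo)
        (repo / "notes.txt").write_text("first draft\n")         # make a change to track
        git("add", "notes.txt", cwd=repo)                        # stage the change
        git("commit", "-m", "Add first draft of notes", cwd=repo)  # record a snapshot
        git("log", "--oneline", cwd=repo)                        # review the history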

The second method is test-driven development (TDD). Writing tests first and then writing the code to make those tests pass results in cleaner, better-designed code that has a much better chance of working sooner. Plus, it turns out to be much easier to write the tests for the code you need to implement and then write the code than to do it the other way around. It was only recently that I really started to appreciate this benefit of TDD, because it's so counter-intuitive. It does work, though. I find myself spending a lot less time staring at the screen contemplating how I'm going to implement the next feature, and more time making good progress, because the problem I'm trying to solve is so much better defined when the tests are written first.
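
As a small illustration of that rhythm (an invented example, not anything from a specific TDD text): the tests below describe the behavior you want from a hypothetical count_words function and would be written first, failing against an empty stub; the implementation is then just enough code to make them pass.

    import unittest

    def count_words(text: str) -> dict[str, int]:
        """Count how many times each whitespace-separated word appears."""
        counts: dict[str, int] = {}
        for word in text.split():
            counts[word] = counts.get(word, 0) + 1
        return counts

    # In a test-first workflow these tests come before the function body above;
    # they pin down the behavior so the implementation has a clear target.
    class TestCountWords(unittest.TestCase):
        def test_empty_string_has_no_words(self):
            self.assertEqual(count_words(""), {})

        def test_counts_repeated_words(self):
            self.assertEqual(count_words("the cat saw the dog"),
                             {"the": 2, "cat": 1, "saw": 1, "dog": 1})

    if __name__ == "__main__":
        unittest.main()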

The combination of version control and TDD allows the third method, refactoring, to be used with impunity. When you have tests to verify that the code still behaves the same after a functionally equivalent change, and version control recording every change to the code base so that you can easily turn back time, cleaning up and optimizing your code becomes much easier and far more likely to actually get done. This trifecta of software development gives you the freedom and confidence to take your programming skills to a whole new level.
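
To show what that freedom looks like in practice (continuing the invented count_words example from above): with the tests still in place, you can rewrite the inside of the function however you like. Here the hand-rolled counting loop is swapped for the standard library's Counter; the behavior, and therefore the tests, stay exactly the same, and version control keeps the old version a single command away if you ever want it back.

    from collections import Counter

    # A functionally equivalent refactoring of count_words(). The tests written
    # earlier don't change; if they still pass after this edit, the cleanup is safe.
    def count_words(text: str) -> dict[str, int]:
        """Count how many times each whitespace-separated word appears."""
        return dict(Counter(text.split()))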

Hitting The Mark Yourself


During my university program, I missed out on learning these critical software development practices. It took years after graduating to learn and appreciate the importance of user stories, version control, TDD, and refactoring, and I'll continue learning and improving. I can only speak from my own experience; other computer science programs, or even my own if taken today, may do a much better job of covering these things. But if you are not learning them in college, you should take it upon yourself to learn them on your own.

Two good ways to do this are to participate in an open source project (or start one) or to do an internship at a company that practices them. Working on a project that follows good software development practices is an invaluable experience, and it will teach you things that college courses overlook or teach poorly. Sometimes doing it is the only way to learn, but you also have to be conscious of what you're learning. It's entirely possible to work on a real-world project and not fully appreciate the software practices the rest of the team is using, so don't write off studying those practices, either. There are all kinds of great books, blogs, and online resources out there for learning good development practices. Supplement real-world experience with study to get the most out of both, and don't assume that college will teach you everything. Take the initiative and make sure you learn what you need to know.
