Tech Book Face Off: The Seasoned Schemer Vs. The Reasoned Schemer

Years ago I was led to the Schemer books by some of Steve Yegge's blog posts. It's been over two years since I read The Little Schemer, but I enjoyed it so much that I always planned to read the sequel, The Seasoned Schemer. I recently made the time to do just that, along with working through another Schemer book, The Reasoned Schemer, which is not so much a continuation of the other two Schemer books as a tangent written in the same endearing style. Daniel P. Friedman and Matthias Felleisen wrote The Seasoned Schemer in the style of a Socratic dialogue, but in a much more whimsical way. A host of authors, including Daniel P. Friedman again, as well as William E. Byrd, Oleg Kiselyov, and Jason Hemann, put together the questions, answers, and Scheme-based reasoning language used in The Reasoned Schemer. The real question is, are these two books as good as the original?

The Seasoned Schemer front cover VS. The Reasoned Schemer front cover

Tech Book Face Off: Breaking Windows Vs. Showstopper!

For this Tech Book Face Off, I felt like expanding my horizons a bit. Instead of reading about programming languages or software development or computer science and engineering, I thought I would take a look at some computer history from the business perspective. There are plenty of reading options out there in this space, but I settled on a couple of books about Microsoft. The first, Breaking Windows: How Bill Gates Fumbled the Future of Microsoft by David Bank, is about Bill Gates' hardball business tactics that won him a monopoly in the PC desktop market, but then nearly destroyed the company in that fateful confrontation with the US Justice Department and caused him to miss the Internet and, later, the mobile revolution. The second, Showstopper! The Breakneck Race to Create Windows NT and the Next Generation at Microsoft by G. Pascal Zachary, has an even longer subtitle that neatly describes the book on its own. Both of these books were written quite a while ago, so let's see how their stories hold up today.

Breaking Windows front cover VS. Showstopper! front cover

Tech Book Face Off: The New Turing Omnibus Vs. Patterns of Software

I'm churning through tech books now, finishing off a bunch that I had started a while back but couldn't find the time to finish until now. The pair that I'll look at here are a couple of older books that I picked up through recommendations on blog posts. The first one, The New Turing Omnibus: 66 Excursions in Computer Science by A.K. Dewdney, is a survey of 66 topics in a wide range of areas of Computer Science. The second book, Patterns of Software by Richard P. Gabriel, offers advice and experiences on a variety of topics in software development. Whereas NTO is of a strictly technical nature, Patterns of Software deals much more with the human aspect of working with computers and software. Let's see how these older books hold up today.

The New Turing Omnibus front cover VS. Patterns of Software front cover

What I've Learned From Programming Languages

I just finished learning about fourteen new programming languages, and while my head may still be spinning, I've been struck by one thing about learning new languages. Every single new language I learn teaches me something new and valuable about programming. Some languages reveal many new things because they happen to be the first language I've learned based on a new programming paradigm, and other languages may expose only one or two new ideas because they overlap quite a lot with languages I already know. Every language has shown me at least one new thing, though, so I thought I'd take a look back and pick out one thing learned from each language I've encountered. Some of these languages I've used extensively, and others I've barely scratched the surface of, so I may miss some great insights in the languages less well-known to me, but that's okay. There's still plenty to reflect on.

Word cloud of programming languages

Tech Book Face Off: Seven Languages in Seven Weeks Vs. Seven More Languages in Seven Weeks

Yes, that's right. I learned fourteen programming languages in as many weeks. Actually, it was more like four weeks, but I just couldn't put these books down. I had wanted to work through them ever since I read Seven Databases in Seven Weeks a few years ago and loved it. Now I've finally made the time to read them, and I had a blast the whole time. I shouldn't have waited so long to crack these books open. I started off with Seven Languages in Seven Weeks by Bruce Tate, and then quickly moved on to consuming his follow-on book, Seven More Languages in Seven Weeks, co-authored with Ian Dees, Frederic Daoud, and Jack Moffitt. It's not as hard as it would seem to learn about so many languages in such a short amount of time, as long as you already have a few under your belt, because the hardest programming language to learn is really the second one. After you overcome the confusion of holding two different languages in your head, it becomes much easier to add another and another and another. The differences get slotted into your brain more readily, and the major hurdle becomes figuring out how to use the new paradigms and features that you run up against. Let's see how these books handled presenting all of these different languages and smoothing the shock of moving from one language to another in rapid succession.

Seven Languages in Seven Weeks front cover VS. Seven More Languages in Seven Weeks front cover

Tech Book Face Off: Confident Ruby Vs. Metaprogramming Ruby 2

I'm always on the lookout for books that will help me improve my programming skills by learning to write cleaner, clearer code. For this Tech Book Face Off I found two relatively short books on how to write better programs in Ruby. The first book is Confident Ruby: 32 Patterns for Joyful Coding by Avdi Grimm, and I paired it up with Metaprogramming Ruby 2: Program Like the Ruby Pros by Paolo Perrotta. I haven't been programming in Ruby too much lately, but I still love the language and think it's a great vehicle for exploring how to write beautiful code. Even if I don't use the things I learn from these books directly in my day-to-day work, plenty of the ideas behind good programming patterns will transfer to any language. The key is to recognize the rationale that makes a particular coding style work well—the underlying principles that make it better than the rudimentary way code is normally written—and adapt that style to the language you're using. So let's see what these books have to offer, whether you're a Rubyist or not.

Confident Ruby front cover VS. Metaprogramming Ruby 2 front cover

Building a Model for Retirement Savings in Python

It's easy to find investment advice. It's a little less easy to find good investment advice, but still pretty easy. We are awash in advice on saving for retirement, with hundreds of books and hundreds of thousands of articles written on the subject. It is studied relentlessly, and the general consensus is that it's best to start early, make regular contributions, stick it all in low-fee index funds, and ignore it. I'm not going to dispute that, but I do want to better understand why it works so well. As programmers we don't have to simply take these studies at their word. The data is readily available, and we can explore retirement savings strategies ourselves by writing models in code. Let's take a look at how to build up a model in Python to see how much we can save over the course of a career.

Disclaimer: I am not a financial adviser, so this article should not be taken as financial advice. It is merely an exploration of a model of retirement savings for the purpose of learning and understanding how savings could grow over time.
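
To give a taste of where the model is headed, here is a minimal sketch of the core mechanism: compound growth with regular contributions. The 7% annual return and $10,000 yearly contribution are hypothetical placeholders for illustration, not figures or recommendations from the full model.

# A toy model of retirement savings: contribute a fixed amount each
# year and grow the balance at a fixed annual return. The numbers are
# hypothetical placeholders, not financial advice.
def savings_over_career(years, annual_contribution=10_000, annual_return=0.07):
    balance = 0.0
    history = []
    for year in range(1, years + 1):
        # Add this year's contribution, then apply a year of growth.
        balance = (balance + annual_contribution) * (1 + annual_return)
        history.append((year, balance))
    return history

for year, balance in savings_over_career(40):
    if year % 10 == 0:
        print(f"Year {year}: ${balance:,.0f}")

The real model gets much more interesting than a fixed rate of return, but even this toy version shows how much of the final balance comes from compounding in the later decades of a career.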

Levels of Understanding

This is going to be a story of how I learn new things and come to understand them well enough to put that knowledge into practice. I imagine these levels of understanding are similar for everyone, but I can't be sure; I'm just going off of my own experience. It's possible that for any given person or any given skill, learning stops at whatever point they've gleaned enough value and exhausted their desire to keep learning.

However, if that desire to learn is strong, learning can develop through three main levels of understanding. The first level is the basic what to do to accomplish a task. The second level delves deeper into how things work in context to solve problems in the domain. The third and final level explores why things work the way they do. The answers to those three simple questions (what, how, and why) encompass all of the understanding that can be gained from any skill, subject, or domain. To ground this discussion, we'll focus on programming to make things more concrete. What is involved in each of these levels of understanding when learning how to program?

Tech Book Face Off: Introduction to Algorithms Vs. Linear Algebra and Differential Equations

This Tech Book Face Off has been a significant undertaking, more so than I originally imagined. It took over ten months to get through these two books, slowly chipping away at them in my free time during the evenings until I had finally read all of the material and worked all (well, nearly all) of the problems. As a result, it's been nearly a year since my last book review, and working through these books was quite the experience. The goal at the outset of this project was to revisit a couple of textbooks from college, things that I felt I should brush up on. I wanted to see how much I had retained of these subjects and how much I had forgotten that may still be useful. I chose Linear Algebra and Differential Equations by Charles G. Cullen because I remember breezing through this subject in college without, I thought, giving it the effort it deserved. Then I picked up Introduction to Algorithms by the famed group of professors known throughout the programming world as CLRS. I love algorithms, and I have fond memories of this course and working through this book, so I wanted to go through it again in its entirety, picking up the subjects that we had skipped in the course as well. What follows is the perspective of one programmer rereading a couple of old textbooks fifteen or so years later.

Linear Algebra and Differential Equations front cover VS. Introduction to Algorithms front cover

For the Love of Books

I currently have seven books in flight—a few that I'm actively reading, a couple that I'm trying to juggle for learning new skills, and a couple more that I started a while back but had to put aside for the time being. What they are doesn't really matter. It seems I'm always in the middle of at least a few books at a time. Please understand this is not bragging; it's love. I love starting a new book, with all of its promises of knowledge or adventure or perspective. I love progressing through the middle of a good book, absorbing enough introductory material to understand more difficult concepts or following the intricate plot twists as a story builds to its conclusion. And I love finishing a great book, reflecting on the knowledge learned and the experiences had through the written word mixed with imagination. Books are still a major source of enjoyment and fulfillment for me, and I don't see that ever changing.

Are Computers Still a Bicycle for the Mind?

Steve Jobs had an enormous appreciation for the computer, believing it was the greatest human invention, and he commonly likened it to a bicycle for our minds. Here he is in one such explanation of this analogy:

[Video: Steve Jobs explaining the "bicycle for the mind" analogy]

He refined his delivery over the years, but the underlying analogy was always the same. The bicycle dramatically increases the efficiency of human locomotion, and likewise the computer dramatically increases the efficiency of human thought. That is still the case when computers, the Internet, and increasingly Artificial Intelligence and Machine Learning are used as tools to leverage our innate abilities to solve huge, complex problems, but they can also become other things for the mind that are not so useful. More and more, as computers proliferate, shrink in size, and become more convenient and ubiquitous, they stop being treated as tools and start being treated as toys or simply as distractions. Maybe computers are becoming less like a bicycle for the mind and more like something else.

Explore Simple Game Algorithms with Color Walk: Part 12

We've now been exploring and discussing game algorithms using the simple game Color Walk for months over the course of 11 posts. We started out extremely simple with random and round-robin algorithms, advanced to some obvious greedy algorithms, and wound up discussing a number of graph algorithms. We've discovered a ton of stuff along the way, so it would be nice to step back and review the ground we've covered to see the big picture in all of the experimentation and details of the various algorithms.

Explore Simple Game Algorithms with Color Walk: Part 11

In this installment of exploring game algorithms using the simple game Color Walk, we're going to do something a little different. Last time we explored a number of variations and hybrids of Dijkstra's algorithm—the classic, efficient graph algorithm for finding shortest paths—and found that pairing it with a pre-run of Greedy Look-Ahead (GLA) performed better than any other algorithm we've seen so far. This time we're not going to explore any new algorithms. Instead, we're going to look into what makes Dijkstra's algorithm tick: the priority queue. Save for this variant of a standard queue, Dijkstra's algorithm is conceptually the same as Breadth-First Search (BFS), so we want to see what makes this priority queue so special and how we can implement one efficiently with a binary heap.
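
As a preview of the idea, here is a bare-bones binary min-heap in Python. It's only a sketch of the mechanics, not the implementation from the post: push and pop each walk a single path between the root and a leaf of the tree, which is what makes both operations O(log n).

# A minimal binary min-heap of (priority, value) pairs stored in a
# list, where the children of index i live at 2*i + 1 and 2*i + 2.
class MinHeap:
    def __init__(self):
        self.items = []

    def push(self, priority, value):
        # Append at the bottom, then sift up while smaller than the parent.
        self.items.append((priority, value))
        i = len(self.items) - 1
        while i > 0 and self.items[i] < self.items[(i - 1) // 2]:
            parent = (i - 1) // 2
            self.items[i], self.items[parent] = self.items[parent], self.items[i]
            i = parent

    def pop(self):
        # Swap the root with the last item, remove it, then sift the new
        # root down by repeatedly swapping with its smaller child.
        items = self.items
        items[0], items[-1] = items[-1], items[0]
        top = items.pop()
        i = 0
        while True:
            smallest = i
            for child in (2 * i + 1, 2 * i + 2):
                if child < len(items) and items[child] < items[smallest]:
                    smallest = child
            if smallest == i:
                return top
            items[i], items[smallest] = items[smallest], items[i]
            i = smallest

heap = MinHeap()
for priority in [5, 1, 3]:
    heap.push(priority, f"move {priority}")
print(heap.pop())  # (1, 'move 1')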

Explore Simple Game Algorithms with Color Walk: Part 10

We're back for another round of exploring game algorithms using the simple game Color Walk. We finally reached the point of evaluating Dijkstra's algorithm—the classic, efficient graph algorithm for finding shortest paths—in the last post. It performed pretty well against the top dogs: Greedy Look-Ahead (GLA) and the GLA-BFS hybrid, especially when it came to consistently finding the best moves. However, it failed to find the best moves when a board could be solved in under 29 moves, so we're going to see if we can squeeze out any more performance by modifying Dijkstra's algorithm further. To do that, we're going to try combining Dijkstra's algorithm with GLA, running Dijkstra's algorithm in more than one pass, and changing the heuristic we use to guide the search.

Explore Simple Game Algorithms with Color Walk: Part 9

Welcome back for more exploration of game algorithms using the simple game Color Walk. In the last post we covered the other fundamental graph search algorithm, depth-first search (DFS), the counterpart to the previously discussed breadth-first search (BFS). These graph algorithms do a full search of the graph of a Color Walk game, the full set of board positions resulting from each move at each point in the game. We found that running either of these algorithms to completion is prohibitively expensive because the graph size is exponential in the number of moves. In order to deal with that exponential growth, we need to look at other graph algorithms, and we have quite a few to choose from. We'll explore some categories of graph algorithms and look at one in more detail: Dijkstra's algorithm.
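
For reference, here is the textbook form of Dijkstra's algorithm in Python. This is a sketch over a toy adjacency dict, not the game's board graph:

import heapq

# Dijkstra's algorithm: graph[node] is a list of (neighbor, weight)
# pairs. Returns the shortest distance from start to every reachable node.
def dijkstra(graph, start):
    distances = {start: 0}
    frontier = [(0, start)]  # min-priority queue ordered by distance
    while frontier:
        dist, node = heapq.heappop(frontier)
        if dist > distances.get(node, float("inf")):
            continue  # stale entry; a shorter path was already found
        for neighbor, weight in graph.get(node, []):
            new_dist = dist + weight
            if new_dist < distances.get(neighbor, float("inf")):
                distances[neighbor] = new_dist
                heapq.heappush(frontier, (new_dist, neighbor))
    return distances

print(dijkstra({'a': [('b', 1), ('c', 4)], 'b': [('c', 2)], 'c': []}, 'a'))
# {'a': 0, 'b': 1, 'c': 3}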

Explore Simple Game Algorithms with Color Walk: Part 8

We're continuing this ongoing saga of different game algorithms using the simple game Color Walk. In the last post we started exploring the fundamental graph search algorithms with breadth-first search (BFS), because after all, the full set of move choices and resulting board positions of any game can be arranged as a graph. After looking at BFS and finding that we can nominally improve the search for the shortest number of moves, on average, it's time we look at the close sibling of BFS: depth-first search (DFS). We'll quickly run into performance issues just like we did for BFS, but let's see if we can come up with a reasonable way to limit DFS so that it can be a useful algorithm.
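
One reasonable way to rein in DFS is to cap how deep it will search. Here is a minimal sketch of a depth-limited DFS in Python on a toy graph of states, purely to show the shape of the idea:

# Depth-limited DFS: explore paths up to max_depth states deep and
# return the first path found that satisfies the goal test. The depth
# cap keeps the search bounded (and, here, stands in for cycle checks).
def depth_limited_dfs(graph, state, is_goal, max_depth, path=None):
    path = (path or []) + [state]
    if is_goal(state):
        return path
    if max_depth == 0:
        return None
    for next_state in graph.get(state, []):
        result = depth_limited_dfs(graph, next_state, is_goal, max_depth - 1, path)
        if result is not None:
            return result
    return None

example = {'start': ['a', 'b'], 'a': ['goal'], 'b': []}
print(depth_limited_dfs(example, 'start', lambda s: s == 'goal', 3))
# ['start', 'a', 'goal']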

Explore Simple Game Algorithms with Color Walk: Part 7

We're continuing to explore different game algorithms using the simple game Color Walk. In the last post we took a deep dive into other heuristics that could be used instead of the obvious maximize-the-number-of-blocks-removed approach and found that the obvious choice is actually hard to beat. After looking at various heuristics, it's time we fill in some gaps in our exploration of algorithms by looking at a couple of fundamental graph algorithms: breadth-first search (BFS) and depth-first search (DFS). These algorithms are the two basic ways to search for something in a graph of nodes (or vertices) connected by edges, and a graph is exactly what we have when we draw out all possible move choices with each node representing a move and edges connecting sequential moves. We'll focus on BFS for this post.
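
To keep the mechanics in mind as we go, here is a minimal sketch of BFS in Python, finding a shortest path (in number of moves) through a toy graph of states rather than the game's actual move graph:

from collections import deque

# BFS explores all states one move away, then two moves away, and so
# on, so the first path that reaches the goal is a shortest one.
def bfs_shortest_path(graph, start, goal):
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for neighbor in graph.get(path[-1], []):
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append(path + [neighbor])
    return None

example = {'start': ['a', 'b'], 'a': ['goal'], 'b': ['goal'], 'goal': []}
print(bfs_shortest_path(example, 'start', 'goal'))  # ['start', 'a', 'goal']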

Explore Simple Game Algorithms with Color Walk: Part 6

What's next with exploring different game algorithms using the simple game Color Walk? We've come a long way so far with building out the game's interface to support looking at different algorithms, looking at trivial round-robin and random algorithms, and designing a greedy algorithm that picked each move based on the most blocks removed. The last post showed how far we could push the greedy algorithm to look ahead up to five moves and pick the move that would result in the most blocks removed, but even after improving the data structure for the board, we hit diminishing returns in looking more moves ahead. It's time to expand our search for algorithms by looking at other heuristics we can use to pick the best move. The greedy algorithm used a heuristic of "the most blocks removed" for any given move, but there are others that we can use.
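
The core of any greedy variant is the same: score every candidate move with a heuristic and take the best one. Here is a minimal sketch in Python with a toy stand-in for the game's scoring, just to show how swapping the heuristic changes the algorithm's behavior without touching the search itself:

# A greedy chooser parameterized by a heuristic: score each candidate
# move and pick the highest scorer. The move dicts below are toy
# stand-ins for the game's real board evaluation.
def greedy_move(moves, heuristic):
    return max(moves, key=heuristic)

moves = [{'color': 'red', 'removed': 3},
         {'color': 'blue', 'removed': 7},
         {'color': 'green', 'removed': 5}]
most_blocks = lambda move: move['removed']
print(greedy_move(moves, most_blocks))  # {'color': 'blue', 'removed': 7}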