Here’s why you should learn to code on a potato [Rant]

Okay, okay, let me be clear: I’m not talking about the delicious, versatile, edible, literal potato. I’m talking about the metaphorical potato – which is a term sometimes used to describe low-end computer hardware.

Now why, you may ask, should you learn computer science on dingy, old, slow, and underpowered hardware? Shouldn’t you strive for the most powerful piece of silicon you can get your grubby little fingers on? See, I beg to differ – I personally think that being limited by your hardware can make you a better programmer.

So what’s led me to this observation? The answer is CS 2261 at Georgia Tech. CS 2261, officially titled “Media Device Architecture”, is a course essentially dedicated to learning the intricacies of, and programming games for, the Game Boy Advance (GBA). The GBA is, affectionately speaking, a potato. It wasn’t particularly powerful when it was released in 2001, and over two decades of technological advancement since then have not done it any favors. That’s not to say the GBA is bad hardware by any means – it was remarkable at launch, balancing good performance, low cost, and excellent battery life. However, its single-core ARM7TDMI clocked at 16.78 MHz is ancient by today’s standards.

Programming games for the GBA, however, has made me an immensely better programmer. Working around the limitations of that clunky old machine (via an emulator) has taught me a lot about writing optimized, efficient code. When you’re working with hardware that stutters and chugs when you draw one extra rectangle, you really start to pay attention to just how inefficient code can be.

I feel that as hardware gets faster and faster, software developers tend to care less and less about refining their code. Especially in the age of iterative development and Agile, it seems that more and more emphasis is placed on just getting working code into production and implementing as many features as a team can cobble together. It’s not such a big deal, the thinking goes, because the hardware is powerful enough to make the inefficiencies unnoticeable.

I think this way of thinking is born from an unfortunate combination of human laziness (of which I am certainly guilty) and industry expectations. The innate human desire to seek the shortest path to completion and the industry’s insatiable thirst for more and more features combine and compound to create absolute monstrosities. However, I fear that this way of doing things is unsustainable.

At the end of the day, computers are magical little boxes that use electricity to turn inputs into outputs. But we live in an age where humanity relies heavily on computation, and one where energy usage is a growing concern. One line of poorly optimized code might be insignificant, but a broadly bloated software base probably contributes to a whole lot of energy waste.

Not to mention, making games for an iconic console is pretty cool too.
