"When you choose a technology, you have to ignore what other people are doing, and consider only what will work best." - Paul Graham
return programming_language == traditional_language ? "Really, you speak LISP?" : "Really, you can't speak LISP?"
Paul Graham's big break in the tech industry was his company Viaweb, which he describes as "one of the first web applications". It was written primarily in LISP, which was uncommon at the time, when "writing applications meant writing applications in C". But LISP was a language Graham and his co-founder knew very well, and its constructs opened the door for them to build a groundbreaking application that was eventually purchased by Yahoo for beaucoup bucks. Graham begins this section of his essays with this story to support his thesis, which I will paraphrase: the technology you use is more important than you think. Programming languages have concrete upsides and downsides relative to one another, and often there is only one language, or a small handful, that is truly best for the job at hand. While I don't agree wholeheartedly with the thesis, I do find it interesting nonetheless.
When Language Matters (And Doesn't)
I think Graham is certainly correct to some degree. Take high-frequency options trading as an example. Companies write programs in languages like C and C++ to execute trades on the stock market with millisecond (and often microsecond) timing precision. The low-level system calls and libraries these languages expose, along with their raw speed, make C and C++ optimal for that kind of work.
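To make that concrete, here is a minimal sketch in C++ of the kind of timing measurement such systems live and die by. Everything in it is invented for illustration: should_send_order is a hypothetical stand-in for real order logic, and the one-tick-spread "strategy" is a toy. The point is just that the standard library (std::chrono) hands you nanosecond-resolution clocks, and the compiled code underneath runs fast enough for the measurement to mean something.

#include <chrono>
#include <cstdint>
#include <iostream>

// Hypothetical stand-in for the real work a trading system does:
// look at the current bid/ask and decide whether to send an order.
bool should_send_order(std::uint64_t bid, std::uint64_t ask) {
    // Toy strategy: act only when the spread is exactly one tick wide.
    return ask - bid == 1;
}

int main() {
    using clock = std::chrono::steady_clock;

    auto start = clock::now();
    bool fire = should_send_order(10000, 10001);
    auto elapsed = clock::now() - start;

    std::cout << "decision: " << std::boolalpha << fire << ", took "
              << std::chrono::duration_cast<std::chrono::nanoseconds>(elapsed).count()
              << " ns\n";
    return 0;
}

Python can time things too, of course, but it can't promise consistently fast execution underneath the measurement, which is the real reason these shops reach for C and C++.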
However, not everyone writes programs that require that kind of performance and precision. When I want to write a program for fun, I usually reach for my most comfortable language, Python. If I am trying to learn a new language, I will attempt to write the program in that language instead. Graham is correct that once a programmer is exposed and attached to a language, they tend to default to it, similar to how, with spoken and written languages, people are most fluent in the first one they learn and often struggle to grasp the grammar and syntax of another.

This past winter I began doing Advent of Code in Rust. I really enjoyed having daily problems to work on in a new language; it facilitated my learning of the basics of Rust through a sort of Duolingo-style daily lesson or challenge. However, I didn't end up completing Advent of Code, partly due to plateauing in my learning of Rust and partly due to not finding time every day to work on the problems. I was sorely tempted to go back to my default language, Python. But this is a perfect example of when programming language doesn't really matter. Sure, it matters in the sense of my enrichment by learning something new, but in the end, as long as I solve the problem, what else matters?

I think the main distinction in when choice of language or technology matters lies between casual hobbyist programming and programming in a professional setting. If the project or product you are working on has financial implications and market competitors, then definitely choose the "best" language for the job. But if you are just working on something for fun, you have a choice: learn something new and expand the number of tools in your toolbelt, or use what you are comfortable with and still achieve the same end result.
What's Next for Programming Languages?
I found Bret Victor's talk and Graham's essays to be extremely interesting in that they highlight the rut programmers get into when we think we have all the tools we need and any so-called "improvement" or new thing is just Yet Another Programming Language. Back in the IBM days, people who programmed in binary machine code scoffed at the idea of using assembly language and symbols in their code. We can probably say the same about today's programmers. Sure, I know Python well enough to think I could use it for the rest of my career, but shouldn't I move fast and break things? Shouldn't I try the next new thing so I don't get left in the dust like those unwilling to learn assembly? I certainly think so. To truly be a computer scientist, I think we all should experiment and take risks in the technology we use. That way, we can be exposed to new ways of thinking and (hopefully) incorporate the concepts we learn into our overall expertise in computer science and programming. I'll update this post in about 25 years when Elon Musk's Neuralink allows me to turn synapses into strings and consciousness into code.
This post is in response to Reading 5 of the Hackers in the Bazaar course offered at the University of Notre Dame.