The Evolution of Coding
How the way we code and learn has changed over the decades
Ever since coding became a professional occupation, it has evolved. Even within the two short decades of my experience, I've noticed change.
I've decided to share what I've seen to find out whether you agree with the changes I've observed.
Learning to Code Through College vs. Through the Community
In the past
We studied the core of computer science, like programming paradigms, software engineering, data structures, computer systems, operating system concepts, etc.
Specific operating systems — e.g., Windows — and programming languages — e.g., C++ — were something we learned ourselves.
We wrote documents and requirements-gathering analyses, prepared presentations, and only then wrote the actual program. We found references in the library and printed our theses on thick pages to hand in as completed assignments.
A degree in computer science was the basic requirement before one could apply for a programming job.
Now
It's nice if you have a degree from a university, but a candidate is selected based on how well they can program. It's fine if you don't have a degree as long as you can code well.
Core computer science concepts are useful background knowledge, but what matters more is your experience with the tools themselves, e.g., administering AWS or having a strong command of a specific programming language.
Even if you don't have a thesis or degree, if you've built an app used by many people or run a popular development blog, you're regarded as an expert in the field.