Your field is moving so fast! How do you keep up?

I get the weirdest comments regarding my job and/or education sometimes. Let me give you an example.

“I don’t envy you, Luke,” said my coworker. “I mean, technology! Man… it moves so fast these days. It’s crazy. I’d never make it in a field like that. By the time you get out of school, everything you have learned is pretty much obsolete.”

How do you even respond to something like that? It is not only preposterous, but also deeply ignorant.

It is true that ours is a fast-moving field. Few other research areas advance with such clockwork regularity. In the last 50 years we have seen nothing but exponential increases in the processing speed of our integrated circuits – and, because of that, in raw computational power. Roughly every 18 months we release hardware that is about twice as capable as what you have right now. That is unprecedented.
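
Just to put that claim in perspective, here is a back-of-the-envelope calculation (a rough sketch – a clean 18-month doubling is an idealization that real hardware trends only approximate):

    # Idealized illustration of the 18-month doubling claim.
    # Real-world improvements are not this regular.
    months_per_doubling = 18
    years = 50

    doublings = years * 12 / months_per_doubling   # about 33 doublings
    speedup = 2 ** doublings

    print(f"{doublings:.0f} doublings in {years} years")
    print(f"=> roughly a {speedup:.1e}x increase in raw power")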

But this does not mean that every two years we reinvent the wheel and start from scratch. Yes, we do get new technologies introduced to our stack, and the environment we work in becomes more complex. But the underlying body of knowledge does not become obsolete. Ever!

For example, every computer (from the big cluster in the server room to your iPhone) is based on the von Neumann architecture, which was conceived in the 1940s and grounded in the theoretical work done by Turing in the 1930s. The very concept of a programmable, general-purpose computer dates back to the 1800s and can be attributed to Charles Babbage. These ideas are still at the core of information technology. They never became obsolete.

The same goes for software, which moves even faster than the hardware. It’s true that we have made great leaps in software engineering, operating system design, compiler design and programming languages. Still, no matter where you are running your code and what language you are using, when you need to sort a list you will probably end up with something like quicksort. Why? Because it is fast in practice, and for the general case we haven’t really come up with anything fundamentally better. Similarly, fifty years from now computer science students will still be learning about the traveling salesman problem and the knapsack problem, and wondering whether P = NP (or maybe not, who knows…).
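
To show just how durable that idea is, here is quicksort in a few lines of Python (a simple, non-in-place sketch – production sorts, like Python’s built-in sorted(), use tuned hybrid algorithms, but the partitioning idea is the same):

    def quicksort(items):
        # Base case: a list of zero or one elements is already sorted.
        if len(items) <= 1:
            return items
        # Partition around a pivot, then sort each side recursively.
        pivot = items[len(items) // 2]
        smaller = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        larger = [x for x in items if x > pivot]
        return quicksort(smaller) + equal + quicksort(larger)

    print(quicksort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]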

No one makes similar assumptions about other fast-moving industries. No one thinks that chemists and pharmaceutical specialists must re-learn their science every couple of years. No one assumes that doctors must re-learn medicine each time we make a new discovery. No one thinks that the automotive industry reboots itself every once in a while.

But computers are magical and they defy logic and common sense.




6 Responses to Your field is moving so fast! How do you keep up?

  1. road says:

    I don’t think this phenomenon is limited to computer science. My sense is that every scientific/engineering field has a “log phase” in which its growth is extremely rapid, and which lasts for a couple of decades. CS’s log phase is now. You mentioned chemistry; its log phase was probably ~1930-1980. I’m not a physicist, but my sense is that its log phase was 1890-1960. I’m a biologist and I can assure you that doing biology today barely resembles doing biology in 1989 (pre-genome, pre-knockout, pre-RNAi). Biology is *definitely* in the middle of a log phase, and I’m certain that all of the techniques I learn now will be irrelevant/quaint in 10-15 years. And medicine probably isn’t as static as we non-doctors imagine…

    Anyway that’s what makes these careers fun. If I didn’t have to constantly keep up with new developments and learn new things every day I’d go insane with boredom.

  2. Ian Clifton says:

    Just respond with “I like to learn” and you’ll either get awkward silence or a lot of backpedaling.

  3. Dr. Azrael Tod says:

    Even better… just think about something “easy” like calculating payments.
    I mean… 1+1=2 has been true practically forever.
    But every year the whole set of rules for taxes and the like changes a lot… without any real need. If you only ever learned the old rules, they won’t be much use once you stop learning.

    I think every job where the work consists of thinking and learning will require you to keep thinking and learning, again and again. If you could stop thinking at some point, it would just become plain, stupid work… and any machine could do it better.
    But it won’t! And so you will keep getting paid…

  4. Mart says:

    Concepts will never go obsolete. I believe that learning programming concepts is more useful than learning programming languages. Once you understand and know the concept, it’s just a matter of googling and practising the syntax.

  5. jambarama says:

    The argument, as I understand it, says that changes in IT require more retraining and relearning than changes in other fields. I’ve heard three explanations for this. Either (a) the changes in IT are more dramatic, so the same number of changes makes more knowledge obsolete, OR (b) the rate of change in IT is faster than in other fields, and the proportion of knowledge becoming obsolete is related to the rate of change, OR (c) the amount/breadth/depth of knowledge required to be employable is greater than in other fields, so the same number of changes affects more people. I haven’t seen anything showing IT is changing faster or more fundamentally than some other field I don’t understand, so I’m not convinced of any of these propositions.
    IT has never had a fundamental revolution, like math with set theory, physics with just about anything since Einstein, or biology with Darwin. Maybe that is because IT is still in its infancy, but the point stands. Like you said, much of the older IT stuff just doesn’t become obsolete. So I’m not sure the changes are more dramatic; we’re still using all that old stuff you mentioned (Turing and Babbage). Heck, I know a guy who came out of retirement to program COBOL because the jobs were paying so well. So I don’t think (a) is right.
    And while everything is changing very quickly now, I don’t get the sense that IT is unparalleled. It is changing fast, no argument there. But every industry where you use your brain is changing and growing rapidly. Not just math and hard science, but agriculture, art, law, teaching, medicine, accounting, and any part of serious academia (anywhere papers are published). Other fields may seem slower from a distance, but I don’t think that holds up when you get down to it. So I don’t have any reason to think (b) is right.
    Last, to be useful & employable in a field, you don’t need to keep abreast of *every* change. I suspect IT is a very broad field, but not typically a very deep one. If you want to learn Flash, you don’t need to read von Neumann. To learn set theory, you need to start at addition. That comparison isn’t entirely fair; to make a processor I suspect you really do need to understand Turing and Babbage and von Neumann. But it seems like the complexity is growing more horizontally than vertically.

  6. Luke Maciak says:

    @ road: Heh. Good point about the log phase.

    Still, our changes tend to be more superficial than those in, say, biology or physics – branches of science that seek to unravel the mysteries of the universe. In those fields you can actually discover something that shatters all existing theories and causes rippling changes throughout the whole community.

    Computer science is more about inventing things, improving what we have now, and developing new approaches. Every once in a while something causes an industry-wide shift (like the OO paradigm), but most of the time CS evolves via slow feature creep.

    @ Ian Clifton: Good one. I will have to try that. :)

    @ Dr. Azrael Tod: You are right. Pretty much every job out there requires some amount of thinking and learning. Even blue-collar jobs – car mechanics need to learn about new models, for example.

    I guess people don’t think about it that much though – they consider it normal to do some on-the-job training, or to go to conferences where they learn about new tools, techniques and so on. But “technology is moving fast” is sort of a meme that people seem to latch onto.

    @ Mart: Exactly. Of course, to learn the concepts you need to learn a programming language first. Or rather, you learn both things simultaneously. And the choice of your first programming language will impact how easily you will be able to grasp some of the more advanced concepts. But yeah – once you know how to do something in one language, you can translate it into another.

    Sometimes this is not desirable though – for example, you don’t want to take your experience from imperative languages and apply it directly to a functional language. So when you are learning Lisp, for example, it’s better not to fall back into old habits. :)

    @ jambarama: Actually, if you are a programmer, you probably know the von Neumann architecture instinctively: there is a CPU, and a generic, multi-purpose memory where you put both the instructions and the data.
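
    If you wanted to see that idea in code, a toy version might look like this (purely illustrative – the three-instruction machine is made up, but the point is that program and data share one memory):

        # Toy von Neumann machine: one flat memory holds both the
        # program and the data. The instruction set is invented here
        # purely for illustration.
        LOAD, ADD, HALT = 0, 1, 2

        memory = [
            LOAD, 6,   # acc = memory[6]
            ADD,  7,   # acc += memory[7]
            HALT, 0,   # stop
            40, 2,     # the data lives in the same memory as the code
        ]

        acc, pc = 0, 0
        while True:
            op, arg = memory[pc], memory[pc + 1]   # fetch
            pc += 2
            if op == LOAD:                         # decode + execute
                acc = memory[arg]
            elif op == ADD:
                acc += memory[arg]
            elif op == HALT:
                break

        print(acc)  # 42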

    I could draw a parallel similar to your math one – if you want to learn to program, you need to start by learning about the binary system. You could get by without it, but then you will be bewildered by things like integer overflow.
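
    A quick illustration (Python integers don’t overflow on their own, so this sketch masks values to 8 bits to mimic a fixed-width register – the helper name is made up):

        # Simulate an unsigned 8-bit register: values wrap around at 256.
        def add_u8(a, b):
            return (a + b) & 0xFF   # keep only the low 8 bits, like the hardware does

        print(add_u8(250, 10))  # prints 4, not 260 - the classic overflow surprise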

    But yeah – my point was that we are actually not growing as fast as people think we are. We do have Moore’s law, which is sort of unique to our field – but that’s just hardware. And that is what people see: they buy a computer and it’s obsolete a year later. So they assume that software moves at a similar rate – but it does not. It moves more slowly – probably at the same pace as most other scientific fields.

    As you said, we haven’t really had any ground-breaking revolutions yet. If quantum computing ever becomes practical, it will be our first big one. The jump from binary logic to quantum logic with qubits will change both the way we build hardware and the way we write software.

