August 28, 2011

Programmer myopia

We like to assume that people are basically competent and rational. Computer programmers enjoy pretending they are more competent and rational than people in other professions.

In many cases neither of those assumptions is true.

Two popular memes surrounding programming languages in the 80s and 90s were the assertions that "garbage collection is too slow" and that "dynamic typing doesn't work for large programs."

Many programmers were convinced both those things were true. In hindsight they were completely wrong, as languages such as Tcl (more on the importance of Tcl in the history of programming languages in an upcoming post), Perl, Java and Python "dragged people halfway to Lisp" and changed public perception.

How could so many people who consider themselves above-average in competence and rationality be so wrong? (Assume that every programmer is indeed a special snowflake and the Dunning-Kruger effect doesn't apply).

A hermit spent 10 years writing a program. 'My program can compute the motion of the stars on a 286-computer running MS DOS,' he proudly announced. 'Nobody owns a 286-computer or uses MS DOS anymore,' Fu-Tzu responded.

Eloquent JavaScript, Marijn Haverbeke


The problem is that programmers seem unable to think even a couple of years into the future. People complaining about garbage collection in the 80s were looking back at their existing 8-bit Trash-80s instead of at the contemporary computers being produced and the future computers being planned. The idea that computers can be good at automating rote tasks like managing memory and checking and inferring types never occurred to them.

People have trouble imagining the future even if the trends, such as Moore's law, are in front of them. It takes a very long time for people to understand the right ideas. Just ask Alan Kay. Being able to find the appropriate point of view really is better than a high IQ.

Here are some other Lisp concepts that programmers believe out of ignorance that will take a long time to dispel:
  • tail recursion is unnecessary and makes debugging difficult
  • macro-based metaprogramming results in unmaintainable programs
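
To make those two items concrete, here is a minimal Common Lisp sketch (the names sum-to, sum-to-acc and with-timing are invented for illustration; the ANSI standard does not mandate tail-call elimination, but implementations such as SBCL routinely perform it):

  ;; Not a tail call: the + is still pending when the recursive call
  ;; returns, so the stack grows with n.
  (defun sum-to (n)
    (if (zerop n)
        0
        (+ n (sum-to (1- n)))))

  ;; Tail call: nothing is left to do after the self-call, so a compiler
  ;; that eliminates tail calls can reuse the current frame (in effect,
  ;; it compiles to a loop).
  (defun sum-to-acc (n &optional (acc 0))
    (if (zerop n)
        acc
        (sum-to-acc (1- n) (+ n acc))))

  ;; A small macro: wraps any body of code with a timing report, something
  ;; a plain function could not do without making the caller pass a thunk.
  ;; e.g. (with-timing ("sum") (sum-to-acc 1000000))
  (defmacro with-timing ((label) &body body)
    (let ((start (gensym "START")))
      `(let ((,start (get-internal-real-time)))
         (multiple-value-prog1 (progn ,@body)
           (format t "~&~a took ~,3f s~%" ,label
                   (/ (- (get-internal-real-time) ,start)
                      internal-time-units-per-second))))))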

9 comments:

Anonymous said...

Re: tail recursion

IMO the name is unfortunate and most implementations are just too finicky. "If these N things are true, the compiler will do a tail call." Miss one and it silently becomes a normal call.

Why not make it explicit like (reuse-stack (f x))? This is good for the human in many ways -- easy to see, easy to disable, compiler errors when prerequisites are violated, etc.
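
For what it's worth, here is a rough Common Lisp sketch of that idea, assuming a hypothetical reuse-stack macro (not a standard operator). A portable macro cannot see whether it sits in tail position, so it cannot force frame reuse by itself; that part needs compiler support. But it can make the intent explicit and reject forms that could never be tail calls:

  (defmacro reuse-stack (form)
    ;; Only a single function call can possibly be a tail call; anything
    ;; else (a literal, a bare variable) is rejected at macroexpansion time.
    (unless (and (consp form) (symbolp (first form)))
      (error "REUSE-STACK expects a single function call, got: ~S" form))
    form)

  ;; Usage: (defun spin (n) (reuse-stack (spin (1+ n))))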

Anonymous said...

...Could you please check the last set of items? I'm not sure, but I think you've double negatived yourself...

If macro-based metaprogramming is something to be avoided, why did you put in your Hacker News link, which says macros are really powerful??

I am slightly confused.

Anonymous said...

> Here are some other Lisp concepts that programmers refuse to believe out of ignorance that will take a long time to dispel:
>
> * tail recursion is unnecessary and makes debugging difficult
> * macro-based metaprogramming results in unmaintainable programs

I think there's a logical typo: isn't the macro thing a misconception people believe in?

mtt said...

A hacker wrote an interesting application in Common Lisp. He proudly shows it on Hacker News. "Nobody uses Lisp anymore..." the crowd responded. ;)

Anonymous said...

>macro-based metaprogramming results in unmaintainable programs

do you think it holds true?

it's a Lisp way of programming to use macros here and there, after all...

Anonymous said...

The problem has never been thinking "garbage collection is too slow"; the problem is thinking "garbage collection is always too slow". Thinking "garbage collection is never too slow" is just as bad a generalization.

As for Fu-Tzu... If it runs on old hardware, it can be made to run on new hardware. And these days, you'd just use an emulator or VM anyway.

As for thinking a couple of years ahead, there is no point to doing that. By then, you'll be out of business... or a job. Unless you're in academia, you must deal with the current state of things.

Attempting to write software for hardware that does not exist is a poor business plan.

I_Artist said...

Whoa. That was the dumbest thing I ever read. Let me explain why...

That program that can compute the motion of the stars on a 286? Well, it can also run on an 8-core CPU today, and it will smoke all the other applications out there, because the developer who made it was competent enough to meticulously manage his memory and types. The CPU wasn't constantly checking to see if this or that piece of memory was still needed; the programmer did. And that memory management was ONLY called when needed.

No matter how fast CPUs get in the next 50 years, the fact is that GC adds processing that isn't needed.

And more importantly, GC doesn't always work. Either the paradigm works 100% or you shelve it. "Oh, you need to force a GC here", well what is that if it is not memory management? But it's worse, because now I have to know when my memory is used AND if GC will handle it correctly. Assuming that GC will always work is foolish, not engineering.

The same argument applies for types. It's adding CPU cycles for no reason. Aren't you an expert programmer? Don't you know what type you are using? And again, it doesn't always work. Sometimes you have to force a type one way or another because of what you want to do with it.

Both of these "advancements" in languages were made for neophyte programmers that can't handle the complexity of real programs. So, in order to cater to lesser programmers, we spew out less efficient applications. This is a big trend in the computing field, one that is already showing signs of rust and decay.

The first satellites were launched into space with 2K of memory. 2K! A satellite!! Now, with the "advance" of Java I have a 1.2GHz CPU in my phone that can't handle 5 applications running on it without bogging down like crazy.

Let's look at this for a second. Look at that hardware. I have a 1.2GHz CPU, 800MB of RAM, a capacitive touch screen, a GPS, a microphone, two speakers, an 8Mpx camera, a 1.3Mpx camera, geospatial sensors, a USB port, a 3.5mm jack, a 16GB microSD card, and a SIM card, all inside a shell that fits in my front pocket. THINK about that. That's amazing progress, that is something that we could not conceive of years ago.

But the software? 5-6 apps (apps, NOT applications, not a word processor, CAD, or a real graphics application) and the thing slows down.

The point is: using that kind of philosophy and those programming techniques, Fu-Tzu couldn't even make an application that could compute the movement of the stars using CPUs made 10 years from now.

P.S. I wanted to sign this with my Google account, but it was making me sign up for blogger. (!) I go by I_Artist, my name is Mike Wallace. Don't ask me what URL this wants...

Casedeck said...

"Here are some other Lisp concepts that programmers refuse to believe out of ignorance that will take a long time to dispel: "

Perhaps my mastery of English is inadequate (it's not my native language). Could you please clarify this part of the text? The statements listed below it are false, and the opposite of them is true?

Vladimir Sedach said...

You guys are absolutely right.

"programmers refuse to believe out of ignorance" should read "programmers believe out of ignorance."

I need to get a second party to proofread posts before I post.