Computers are not getting faster
When we discuss the inclusion of a feature in Lua, we ask ourselves, "OK, but will it run in a microwave oven?"--Luiz Henrique de Figueiredo
I recently started working with John Fremlin's TPD2 HTTP server, one of the fastest web servers currently available. TPD2 uses a lot of interesting (at least from a Lisp point of view) techniques to achieve high performance. This reminded me of something I noticed in Masterminds of Programming: both Chuck Moore and the creators of Lua emphasized the tendency of computers to scale down.
While laptops and servers are getting faster processors with larger caches and more cores (which doesn't help much, because we don't even know how to think about writing software for them), it's easy to overlook the fact that we're trying to imbue ever smaller objects with computational intelligence. Moore provided the extreme example of a chip whose cores have 64 words of RAM and 64 words of ROM each. A less extreme example comes in the form of smartphones.
At the same time, use of existing systems is increasing to the point where some people measure how many servers they buy by the tons of waste their packaging generates. If each of your servers could handle three times as many requests per second, you could fit the same workload on a machine a third of the size, or run a datacenter a third of the size. The latter is a pragmatic consideration applicable at any time, but the former enables entirely new patterns of use and interaction.
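To make that proportionality concrete, here is a back-of-envelope sketch; the load and capacity figures are made up for illustration and are not measurements of TPD2 or any real fleet.

import math

def servers_needed(peak_requests_per_sec, requests_per_sec_per_server):
    # Smallest whole number of servers that can absorb the peak load.
    return math.ceil(peak_requests_per_sec / requests_per_sec_per_server)

peak_load = 90_000          # hypothetical peak requests/second for the site
baseline_capacity = 1_000   # hypothetical requests/second one server handles

print(servers_needed(peak_load, baseline_capacity))      # 90 servers
print(servers_needed(peak_load, 3 * baseline_capacity))  # 30 servers: a third of the fleet

Tripling per-server throughput cuts the required hardware to a third, which you can cash out either as a smaller datacenter or as the same service running on a much smaller machine.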