Why Windows Vista sucks and the Red Queen rules
Sunday, December 28, 2008
When I bought my ThinkPad X200 laptop, I ordered it with Windows Vista Business 64-bit because I wanted 4 GB of RAM. The machine itself is a computing powerhouse (at least it was 2 months ago): a 2.4 GHz dual-core Intel chip, 4 GB of RAM, a 160 GB 7200 RPM hard drive, a built-in webcam, an a/b/g/n Wi-Fi card, a Bluetooth antenna, a fingerprint reader, etc., etc. Nothing could stop this machine from super-computing, except, of course, Windows Vista.
Vista is almost useless on a laptop: it's slow; it's sticky; it's a memory hog (with just the OS running, 1.8 GB of RAM is already used); and it crashed a few times. Needless to say, I switched my OS to Windows XP: it's not perfect, but it's not Vista. I had to compromise between using a stable OS and losing 1 GB of working RAM with 32-bit XP, as I didn't want to switch to 64-bit XP. Once again, my ThinkPad rocks and I can actually do the things I need to do: code and surf (not in that particular order).
The experience of using Vista on my laptop was painful. And as with every other commercial transaction, when something goes wrong, I immediately want to blame someone for having to deal with such a crappy product. I can't blame the OS developers--after all, they are just doing their job--but I can lay blame on Microsoft, the corporation: the model of OS upgrades is very profitable when you sell the beast to massive install bases such as banks or insurance companies, and you dictate when they upgrade to the newest iteration. But when it comes to individual users, it makes no sense to keep forcing them to upgrade operating systems.
For instance, all else being equal, NT 4 Workstation was a rock-solid development OS, which I would still use if it were still available and supported. With time, though, I had to start using the newer Microsoft operating systems, because every new computer has to come with an OS. With my new computer, I had the choice of Vista; I took a chance, and I regretted it.
With the blame-game fresh in my mind, I started questioning my objectivity, i.e., why do I claim that Windows Vista sucks?
To be fair, I needed to consider other underlying issues aside from the typical "Vista sucks, period!" argument. After all, I'm not a child and "just because" is not a compelling argument; it doesn't even work for Gabriel anymore, and he's only 8 years old.
True enough, I'm entitled to an opinion of every product I buy, whether positive or negative; however, is the cycle of competing resources between hardware and software anything new?
By competing, I mean that software applications require more powerful hardware than the previous generation of applications did; in addition, when more powerful hardware becomes available, applications become more power-hungry, thus diminishing the computing gains achieved with faster chip architectures.
We've seen this cycle perpetuate itself in a few fields. In the early '70s, Leigh Van Valen started calling this phenomenon the Red Queen. Matt Ridley gives a lucid and thorough explanation of what he thinks the Red Queen means in terms of evolutionary biology1 in his book The Red Queen: Sex and the Evolution of Human Nature.
Ridley's main thesis is that sex (the mixing of genes) arose as a survival mechanism to outrun diseases and killer viruses2. He argues that the more diversified DNA is, the better chance its carrier has of surviving the things that can kill it, thus extending the life of the genes.
From generation to generation, viruses try to crack the "lock" of a particular gene, and the generational mixing of chromosomes keeps changing the lock; in other words, the mixing keeps outrunning killer viruses, and killer viruses keep mutating to crack the locks. At some point in the evolutionary timeline, both hit an equilibrium: we have countless diseases lying dormant in our DNA that cannot be activated without the right key, and our genes don't need to mutate to eliminate these potential illnesses because they are not yet a threat.
This state of equilibrium is the Red Queen effect: "It takes all the running you can do, to keep in the same place," as Lewis Carroll tells it in Through the Looking-Glass.
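The lock-and-key chase can be sketched as a toy simulation. This is purely my own illustrative "matching alleles" model, not anything from Ridley's book, and the parameters are made up: a parasite infects a host only when their single gene matches, so each side's allele frequencies perpetually chase the other's without either gaining ground.

```python
def red_queen_sim(generations=200, s=0.3):
    """Toy 'matching alleles' co-evolution model (illustrative only).

    A parasite cracks a host's lock only when their single gene
    matches, so common host alleles are punished and the parasite
    keys that match common hosts are rewarded.  The frequencies
    chase each other in cycles: lots of running, no net progress.
    """
    host_a = 0.5   # fraction of hosts carrying allele A
    para_a = 0.9   # fraction of parasites keyed to allele A
    history = []
    for _ in range(generations):
        # Hosts whose lock matches the common parasite key suffer.
        w_host_a = 1 - s * para_a
        w_host_b = 1 - s * (1 - para_a)
        host_a = (host_a * w_host_a /
                  (host_a * w_host_a + (1 - host_a) * w_host_b))
        # Parasites keyed to the now-common host allele prosper.
        w_para_a = 1 + s * host_a
        w_para_b = 1 + s * (1 - host_a)
        para_a = (para_a * w_para_a /
                  (para_a * w_para_a + (1 - para_a) * w_para_b))
        history.append((host_a, para_a))
    return history
```

Run it and neither allele ever fixes: the host frequency dips while the matching parasite key is common, the parasite key then declines as its targets become rare, and the chase starts over.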
Do the statements "Vista sucks" and "resources race" go hand in hand? More importantly, is there a Red Queen effect at play?
In the context of operating systems and software applications, the more computing power they have and will have, as per Moore's Law, the more power-hungry they will be. This immediately struck me as the Red Queen at work.
In the computing industry we see this effect, in part, because of the gaming industry: game development really pushes the boundaries of commodity hardware3. With more raw CPU power available, game developers invent new algorithms to take advantage of it; similarly, the more power game developers need, the faster the CPU architectures chip makers invent. This is a never-ending cycle. What's more, we're fortunate enough that we don't need to wait long to see it happen: a couple of years, at most.
This cycle of computing power becoming available and software applications using more and more of it is not in itself a problem: with more computing cycles, software can do incredible things, e.g., video games. It's hard to argue that more of both isn't better. The problem is that when the plain OS merely keeps pace with the CPU (in the Red Queen sense), we cannot tell where all our investment in new hardware goes: we are, essentially, doing the same with more; think of it as computing 2 + 2, only once, on a supercomputer.
It's unfortunate that operating systems take away all the computing gains we've achieved. For example, I don't really need a 2.4 GHz machine with 4 GB of RAM and 160 GB of disk space to run a file explorer and a clock at the bottom right-hand corner of my LCD display (in a 64-bit OS, no less). With Windows Vista, this is exactly what happened on my laptop: Vista sucked the life out of my 2.9 lbs technological wonder.
My contention is that we are at a crossroads: either we continue running on this technological treadmill, or we completely turn things around.
I think we should stop to think about our industry for a moment, and consider this running that takes us nowhere really, really fast. For example, we've toyed with the idea of machines without an OS that boot directly from the BIOS into a kind of browser application: the machine should be as dumb as possible, with no operating system, while networks and countless powerful computing farms offer us computing on demand.
What I mean is that we need to create truly on-demand computing platforms: all software should really be internet services, not fat desktop applications running on top of even fatter operating systems (Windows, OS X, or Linux). In other words, I don't really need Windows Vista or OS X to write documents or send emails; we do it this way today, however, because everything is bundled and we have no choice in the matter.
Again, more succinctly: the future of computers will be thin, i.e., dumb terminals with on-demand operating systems running on-demand applications.
I actually thought Google was working on this (Chrome may be the beginning), but they are very secretive about it (and I would hate to see ads everywhere in my apps). Microsoft, I'm sure, is already working on this problem: their business model of charging money for 100+ GB worth of operating systems and office applications will come to an end shortly (within 5 to 10 years), and they know it.
Finally, does Vista suck? I think it does. On the one hand, the Red Queen effect also took over Microsoft's engineering department: Vista sucks, but there is an excuse for its suckiness (this is nonsense, yet here we are). On the other hand, just because all that computing power is there doesn't mean you should keep inventing background services to make use of it, especially when all those services are not really needed. For example, do I need bubbly, transparent windows? No. Do I need to be asked every time whether an application can connect to the internet? No. What I do need is to see performance gains with my newer and faster hardware. I really don't see the point of paying a premium for premium hardware if the OS itself will suffocate the life out of every electron going through every circuit in my computer, just because we are all right with running faster than ever but going nowhere.
1. If you are interested in evolutionary biology, Ridley also wrote Genome: The Autobiography of a Species in 23 Chapters. I highly recommend both his books, together with Richard Dawkins's The Selfish Gene and The Blind Watchmaker.
2. Viruses or virii? Since high school, I remembered the plural of virus as virii. Our language has evolved, and the correct plural of virus is now the accepted viruses.
3. You can dispute the claim that the gaming industry has done more for computers than Word and Excel combined, but I won't argue the point. I will, however, recommend you read Invisible Engines: How Software Platforms Drive Innovation and Transform Industries, by David S. Evans et al., to help you reach your own conclusion. There have been other factors, but the gaming industry has inked its place in the history books as one of the main catalysts for continuous hardware upgrades. If you don't believe me, ask any gamer you know.