Sunday, July 29, 2007

Hasta la Vista, part 1: Microsoft's final death march

Poor Bill Gates. The world will remember him as the richest man in the world and the founder of Microsoft, but he will never achieve the same stature as his arch-rivals Steve Jobs and Steve Wozniak (aka "the Woz").

The younger generation may not know the latter, but Steve Wozniak was the technical wizard who put Apple on the map. He needed only six chips to control a floppy disk where others needed twenty. He added hi-res graphics to the Apple ][ with the words: "It was only two chips. I didn't know if people would use it." BTW, he also designed the interpreter Calvin. Needless to say, this man gets the utmost respect from us nerds.

Unfortunately, Bill Gates is not a wizard. Even worse, he is a bad programmer. When Marlin Eller, a Microsoft programmer, found an error in the flood fill routine of the MS-Basic interpreter, he exclaimed "Which moron wrote this brainless sh*t?", only to find out it was Gates himself who had written it. I think it is safe to say that Bill Gates is hardly the technical wizard he would so much like to be.

The keyword of Steve Jobs's life is "next". Steve always knew what was coming next. It was no surprise to me that when he founded a new company and subsequently built a computer, he called them both "NeXT". Steve may not have invented everything himself, but he always seemed to know what was "hot" and what was not. In the seventies, the microcomputer was hot. In the eighties, the graphical user interface was hot. In the nineties, Unix was hot. At the beginning of the next millennium, digital music was hot. Even if you don't like Steve Jobs at all, you can't say he doesn't have a keen eye for trends.

Bill Gates's track record is pretty bleak compared to Steve Jobs's. While Jobs and Wozniak were building their Apples, Bill Gates was writing punched-paper-tape Basic programs for the flopped Altair microcomputer in a shabby motel in Albuquerque. When the world was clicking away on its Mac, Microsoft brought you MS-DOS 3.3, which didn't even fully use the Intel 80286 microprocessors of the time. When Marlin Eller told Gates that he should not ignore the small bandwidth available in 1995, Gates had no idea what he was talking about. "Er.. er.." was all he could utter. Again, Gates has good business instincts, but he is no visionary. He may have written "The Road Ahead", but unfortunately for him the rest of the world headed in a completely different direction, hence a second edition to correct a few errors Mr. Gates made (like completely ignoring the Internet). But what do you expect from somebody who thought you would never need more than 640 KB of memory?

One notorious proof that Bill Gates gets it wrong time and time again is the "death march". According to Cinepad (which hosts an entire MS vocabulary), a death march is: "The long, lingering final countdown to a ship date, involving 16-25-hour days, catnaps on couches, and plenty of 'flat food' (food, mostly from vending machines, that you can slip under people's doors so they can keep working)". In 2001 Microsoft made a documentary film celebrating the creation of Windows XP. Jim Allchin, a Microsoft vice president, previewed the film and ordered it to be burned. Filming at the Microsoft campus is like filming in a slaughterhouse: you may like the meat, but you don't have to know how it is made. Death marches simply aren't pretty.

In effect, death marches are a necessary evil for Microsoft to catch up and stay in business. The first death march was in 1984, when Microsoft desperately tried to keep up with Apple's revolutionary Macintosh. And don't get me wrong, some of Microsoft's programmers are pretty smart. The first, never-released version of Windows used the technically superior "pre-emptive multitasking". Bill Gates didn't have a clue what that meant. He just wanted a Macintosh clone, and he wanted it now. He didn't want proportional scroll bars, because the Mac didn't have them. He didn't want drag-and-drop functionality, because the Mac didn't have it. Can you imagine what Windows 1.0 could have been if the team had had their way? Instead they dumped all the code they had written so far and used "cooperative multitasking", which has since been responsible for millions of computer crashes and freezes. It set the team back a year.
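To see why cooperative multitasking is so fragile, here is a minimal sketch of a cooperative round-robin scheduler (purely illustrative; this is not how any version of Windows was actually implemented). The scheduler only regains control when a task voluntarily yields, so one task stuck in a loop that never yields freezes every other task, and the GUI with it:

```python
from collections import deque

def scheduler(tasks, max_steps=20):
    """Cooperative round-robin: the scheduler can only switch to the
    next task when the current one voluntarily yields. A task that
    never yields would hang everything; a pre-emptive kernel, by
    contrast, can interrupt it on a timer tick."""
    queue = deque(tasks)
    trace = []
    while queue and len(trace) < max_steps:
        task = queue.popleft()
        try:
            trace.append(next(task))  # run the task until it yields
            queue.append(task)        # requeue it for another turn
        except StopIteration:
            pass                      # task finished, drop it
    return trace

def task(name, slices):
    for i in range(slices):
        yield f"{name}:{i}"           # yield = hand control back

trace = scheduler([task("A", 2), task("B", 2)])
# tasks interleave only because each one yields: A:0 B:0 A:1 B:1
```

Replace `yield` with an endless `while True:` loop in one task and the whole system stalls; that, in miniature, is the crash-and-freeze mode described above.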

Windows 1.0 was too little, too late. The Macintosh had set the standard for GUI-based desktops. Consequently, Windows 2.0 was a flop. So was Windows/286. Meanwhile, everybody was waiting for the coming of OS/2. Then came Windows 3.0. And it came big time. It was a hit. We would never hear from OS/2 again.

What few people knew is that Windows 3.0 was a very ugly hack, put together by David Weise. What he did was basically very simple: he ran Windows in a debugger to find out which parts didn't run in protected mode and then fixed them, line by line. By that time the Windows code base was already hundreds of thousands of lines. In the end, he had done it. The first time it "ran", it crashed. This is what Microsoft calls a "Zero Bug Release": not, as you might suspect, a version of a software product that's error-free, but (in an Orwellian twist) a release with the major bugs eliminated, retaining plenty of less significant problems.

Windows seemed unstoppable, especially when Windows 3.1 and Windows 3.11 emerged. Win32s was a library that enabled 32-bit programs to run under these GUI shells, but Microsoft was still far from a 32-bit operating system. Fortunately, Dave Cutler and his team were after something new. It was obvious that Cutler didn't want to make a mere PC operating system; they were after something far bigger. Something that could take on Unix, Cutler's eternal nemesis, head on. In short, he rebuilt VMS. There are several articles on the Internet about the technical similarities between VMS and WNT. And if you don't believe those, shift each letter of WNT one position back in the alphabet. In any case, Microsoft paid Digital Equipment $150 million in compensation for using portions of an old Digital OS in WNT.
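The letter shift is easy to check for yourself (the same trick, in the other direction, famously turns HAL into IBM):

```python
def shift_back(s):
    """Shift each letter one position back in the alphabet."""
    return "".join(chr(ord(c) - 1) for c in s)

shift_back("WNT")  # -> "VMS"  (W->V, N->M, T->S)
```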

Cutler would let nothing stand in the way of realizing his design and often clashed with his programmers, senior Microsoft management, and even Gates himself. Gates needed a vehicle that would further Microsoft's marketing strategies rather than a robust OS. The success of Windows made Microsoft change its strategy, so the NT programmers were forced to upgrade the 16-bit Windows API into what is now called the Win32 API instead of making a clean, fresh 32-bit API. And since much of the eventual coding on NT was done by Microsoft engineers, in the end the quality of NT's final code wasn't even in the same league as VMS.

It is no accident that Microsoft's coding has such a bad rep. The choices made at Microsoft are deliberate. An example: Jon Ross accidentally left a bug in SimCity for Windows 3.0 where he read memory that he had just freed. It worked fine on Windows 3.x, but on beta versions of Windows 95, SimCity stopped working in testing. Microsoft tracked down the bug and added specific code to Windows 95 that looks for SimCity. If it finds SimCity running, it runs the memory allocator in a special mode that doesn't free memory right away. Note that they probably didn't make such provisions just for SimCity, but for other programs too. Why? Simply because the merchantability of Windows is more important than technical excellence and a clean design. You can imagine how much room that leaves for bugs and malicious code.
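The mechanism behind the SimCity story can be sketched with a toy allocator. This is a deliberately simplified model, not Windows 95's actual heap: in normal mode a freed block is recycled immediately, so a use-after-free read can return somebody else's data; in a "don't free right away" mode, freed blocks are parked instead of reused, so the buggy stale read still happens to see the old contents:

```python
class ToyAllocator:
    """Toy heap model (illustrative only). free() normally makes a
    block immediately reusable; in quarantine mode, freed blocks are
    parked and never recycled, so use-after-free reads keep 'working'."""

    def __init__(self, quarantine=False):
        self.quarantine = quarantine
        self.heap = {}        # block id -> contents
        self.free_list = []   # ids eligible for immediate reuse
        self.parked = []      # quarantined ids, kept intact
        self.next_id = 0

    def alloc(self, data):
        if self.free_list:
            block = self.free_list.pop()   # recycle a freed block
        else:
            block = self.next_id           # or carve out a new one
            self.next_id += 1
        self.heap[block] = data
        return block

    def free(self, block):
        if self.quarantine:
            self.parked.append(block)      # park it, contents survive
        else:
            self.free_list.append(block)   # next alloc may overwrite it

    def read(self, block):
        return self.heap[block]            # no validity check, as in C

def simcity_bug(allocator):
    a = allocator.alloc("city map")
    allocator.free(a)
    allocator.alloc("other program's data")  # may recycle block a
    return allocator.read(a)                 # the use-after-free read

simcity_bug(ToyAllocator())                 # -> "other program's data"
simcity_bug(ToyAllocator(quarantine=True))  # -> "city map"
```

The quarantine mode doesn't fix the bug; it merely arranges the heap so the bug stops being visible, which is exactly the kind of compatibility-over-cleanliness trade-off the post is criticizing.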

Microsoft is also known for its bloatware. That is intentional too. Joel Spolsky, a former Microsoft employee, explains: "In 1993, given the cost of hard drives in those days, Microsoft Excel 5.0 took up about $36 worth of hard drive space. In 2000, given the cost of hard drives in 2000, Microsoft Excel 2000 takes up about $1.03 in hard drive space. (..) In fact there are lots of great reasons for bloatware. For one, if programmers don't have to worry about how large their code is, they can ship it sooner. And that means you get more features, and features make your life better (when you use them) and don't usually hurt (when you don't). If your software vendor stops, before shipping, and spends two months squeezing the code down to make it 50% smaller, the net benefit to you is going to be imperceptible. Maybe, just maybe, if you tend to keep your hard drive full, that's one more Duran Duran MP3 you can download. But the loss to you of waiting an extra two months for the new version is perceptible, and the loss to the software company that has to give up two months of sales is even worse".

Of course, this way of working simply couldn't continue; sooner or later it had to go wrong. I don't mean the viruses, the spyware or the overall security (or lack thereof). Microsoft tolerates the bugs riddling its software, since problems can always be patched over. However, with each patch and enhancement it becomes harder to strap new features onto the software, since new code can affect everything else in unpredictable ways. In short, the software becomes unmaintainable. And with Vista that point was reached.

In an article originally featured on a site that mysteriously disappeared a short while later, and in the (paid) online version of the Wall Street Journal, David Richards describes what happened. Jim Allchin personally broke the bad news to Bill Gates. "It's not going to work," he told Gates in the chairman's office. "Vista is so complex its writers will never be able to make it run properly." He showed Gates a map of how Windows' pieces fit together. It was 2.75 meters tall and 3.75 meters wide and looked like a haphazard train map with hundreds of tracks crisscrossing each other. Of course, Windows could be redesigned so that Microsoft could easily plug in or pull out new features without disrupting the whole system, but that would mean throwing out years of computer code and starting from a fresh base. Vista would have to be simple. Yeah, right... And pigs do fly!

In the next part of this series, I will focus on the problems that plagued Vista later on and some rather disturbing details around Windows security.


Barbarians Led by Bill Gates
- Jennifer Edstrom & Marlin Eller, 1998
