On Microsoft Secrets
by Michael Cusumano and Richard Selby

This book is the result of an analysis of software development practice within several of Microsoft's teams. It found that practices were fairly consistent across projects, not through edict but because each team wanted to adopt the good ideas that others had developed.

The overall impression is of a system that's less refined than ours in many ways. This is partly due to the continual reorganization that goes on at MS, and to the inexperience of the lower-level developers constantly being brought in. MS hiring places such a premium on native intelligence, to the exclusion of experience, that every project has to make many of the same mistakes itself before it gets good at developing product.

An attitude evinced by some projects is that volume is everything in software: huge numbers of sales matter above all else. Duh.

Another ideal is to set standards rather than follow them. With Visual C++, for instance, MS decided to change the battleground from providing a good compiler and tools (a battle they'd already conclusively lost) to providing a package that makes a complex language easy for a regular programmer to develop with. They succeeded, and Visual C++ came from behind to become the market leader.

MS applications teams spend a fair amount of effort measuring "customer activities", roughly what the standard literature calls "use cases". The idea is to measure how customers actually use the product in order to make it easier to use.
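The sort of instrumentation this implies can be sketched very simply (the names here are my own, not anything from the book): count feature invocations so that later analysis shows which commands customers actually reach for.

```python
from collections import Counter

class UsageLog:
    """Counts how often each feature is invoked, so designers can see
    which commands customers actually use (hypothetical sketch)."""
    def __init__(self):
        self.counts = Counter()

    def record(self, feature):
        # Called from each command handler in the product.
        self.counts[feature] += 1

    def most_used(self, n=3):
        # Returns the n most-invoked features with their counts.
        return self.counts.most_common(n)

log = UsageLog()
for cmd in ["open", "print", "open", "save", "open", "save"]:
    log.record(cmd)
print(log.most_used(2))  # "open" leads with 3 uses, then "save" with 2
```

In a real product the log would be written to disk and gathered from usability-lab sessions rather than printed, but the principle is the same: design effort goes where the counts are high.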

MS sets dates and goals for projects without any basis in fact. They know they can't actually predict the schedule, but they try anyway. The schedule really exists mainly to flog the workers into working really hard.

MS's source-code control system is similar to our SCS, but less advanced. They require someone to act as the build integrator each day, rebuilding any changed code. Where we force every check-in to build, they force each day's worth of check-ins to build. One feature they have that we might consider is an automated smoke test that developers can run to see if the code works. We tried this once before at KA and it didn't work out, but I think we were trying for a more complete test. A simpler test-the-major-features sort of test would still be handy; we'll have to think about whether use of IC might make this obsolete for us.
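A test-the-major-features smoke test could be little more than a list of checks a developer runs before checking in. A sketch, with placeholder checks standing in for whatever the product's major features actually are:

```python
def check_startup():
    # Placeholder: would launch the app and confirm it comes up.
    return True

def check_save_load():
    # Placeholder: would save a document and read it back intact.
    return True

# The point is the list, not the checks: each major feature gets one
# cheap check, and the whole suite runs in a minute or two.
SMOKE_TESTS = [check_startup, check_save_load]

def run_smoke_tests():
    failures = [t.__name__ for t in SMOKE_TESTS if not t()]
    if failures:
        print("SMOKE FAILED:", ", ".join(failures))
    else:
        print("smoke ok")
    return not failures
```

The value is in catching "the build is obviously broken" before anyone else pulls it, not in thorough coverage; thoroughness is what sank our earlier attempt.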

An interesting idea was that of "testing buddies". Each developer knows the tester who is testing his/her code. They work together, with the developer telling the tester what things are particularly worth checking. This is the system Lee and I used with Lotus, and it was handy. In general, though, MS puts far too much weight on testing: their code is hacked together for deadlines, and then they try to test their way out of it. Dark ages.

It's amusing that MS regards the use of all-in-one-place teams as one of the secrets of their success. This is a major difference between them and us. I think it's another example of MS arrogance: just because they've got great business sense, they think they've actually got a worthwhile development process too! Their reasoning is that they have to have meetings all the time to quickly straighten out all the things that go wrong in a project. The solution I would use is to prevent those things from going wrong in the first place, but hey, that's just my opinion. There's certainly no doubt that the go-to-the-office approach is much harder on their teams, especially given how hard they're made to work. Burnout is incredibly high, basically expected.

A phrase MS uses is "eat your own dogfood". It means that people within the company have to start using MS products even while they're in development, to force them to improve the really bad features. Enforced alpha testing. One aspect of "eat your own dogfood" is important for us too, I think: we should be running the software we produce on lower-end machines instead of the faster Pentiums we're used to. Maybe we need more memory to run the development software, but I think we'd get by just fine with slower CPUs, and it would make us far more aware of performance problems in the software. Now that I think of it, though, we could get the same effect with some sort of slow-down factor built into the program during development; no need to hobble the machine to slow things down.
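The slow-down-factor idea might look like this in practice. This is only a sketch, and the SLOWDOWN knob is made up: the idea is to stretch the cost of operations in development builds so a fast machine feels like a low-end one, and to compile the delay out of release builds.

```python
import time

SLOWDOWN = 4  # hypothetical dev-build setting; 1 in release builds

def stretched(base_cost):
    """Extra delay (seconds) to add so work costing base_cost seconds
    feels SLOWDOWN times slower, as on a low-end machine."""
    return base_cost * (SLOWDOWN - 1)

def dev_delay(base_cost):
    # Called at the end of user-visible operations in dev builds only.
    if SLOWDOWN > 1:
        time.sleep(stretched(base_cost))

def redraw_screen():
    start = time.time()
    # ... the real drawing work would go here ...
    dev_delay(time.time() - start)
```

With SLOWDOWN at 4, a redraw that takes half a second on a developer's machine takes two, so sluggish features get noticed and fixed long before a customer on slower hardware hits them.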

Substantive changes:
    March 3, 1996: created.
Copyright © 1996, Steve Colwell, All Rights Reserved