Via The Old New Thing, I found this video about the "fault tolerant heap" in Windows 7. Microsoft analyzed a large number of program crashes and found that a significant fraction of them were caused by heap corruption. As a stopgap for the most common of these, they created a mode for running applications that applies mitigations against the most frequent corruption patterns.
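To make the idea concrete, here is a minimal sketch of two mitigations a fault-tolerant heap can apply: over-allocating each block so small overflows land in slack space, and quarantining freed blocks so use-after-free hits memory that is still intact. The names (`tolerant_malloc`, `kSlack`, and so on) are my own invention for illustration, not the actual Windows implementation or API.

```cpp
#include <cstdlib>
#include <cstddef>
#include <cassert>
#include <deque>

constexpr std::size_t kSlack = 16;         // padding appended to every block
constexpr std::size_t kQuarantineLen = 64; // freed blocks held back this long

static std::deque<void*> quarantine;

void* tolerant_malloc(std::size_t size) {
    // A small overflow now lands in the slack instead of the next block.
    return std::malloc(size + kSlack);
}

void tolerant_free(void* p) {
    // Defer the real free: a dangling pointer dereferenced soon afterwards
    // still points at intact (though logically dead) memory.
    quarantine.push_back(p);
    if (quarantine.size() > kQuarantineLen) {
        std::free(quarantine.front());
        quarantine.pop_front();
    }
}
```

Neither trick fixes the bug; both just make the corruption less likely to take the process down, which is exactly the compatibility goal described in the video.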
On the one hand, this sounds like a very bad hack which, from my purist point of view, should not be used at all. On the other hand, it is a technical accomplishment, so my criticism here is not aimed at the Microsoft engineers. They probably have to face stupid programmers and stupid users every day, and are just trying to find ways of handling them more efficiently.
However, I consider this approach dangerous, in the sense that programmers may be tempted to simply turn on the fault tolerant heap (and have their installer do so) rather than program cleanly. I can imagine that being much easier in many cases, and time is money. And then, in ten years, when a lot of new, badly written "standard" software relies on it, it will no longer be a feature but a necessity for that software to run at all.
I cannot really understand why this kind of bug is still an issue anyway. There are good, fast, and stable implementations of automatic memory management out there, especially garbage collectors, which should be sufficient for most software (which is bloated anyway, so nobody will notice the difference). That even goes for macro assemblers like C++, if you want it desperately.
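Even without a full garbage collector (the Boehm collector is the usual retrofit for C and C++), standard C++ already offers automatic lifetime management through smart pointers, which eliminates the same class of double-free and leak bugs. A minimal sketch with invented names:

```cpp
#include <cassert>
#include <memory>
#include <string>

// A node owning its successor through shared_ptr: no manual delete anywhere.
struct Node {
    std::string name;
    std::shared_ptr<Node> next;
};

std::shared_ptr<Node> make_chain() {
    auto b = std::make_shared<Node>(Node{"b", nullptr});
    auto a = std::make_shared<Node>(Node{"a", b});
    return a;  // both nodes stay alive exactly as long as they are reachable
}
```

When the last `shared_ptr` to a node goes out of scope, the node is freed automatically, so the use-after-free and double-free patterns the fault tolerant heap papers over simply cannot occur. (Reference counting is not a tracing collector and cannot reclaim cycles, but for tree- and chain-shaped ownership it gives the same safety.)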
Of course, plenty of other bugs cannot be prevented this way, but this particular kind is a solved problem. Freed from thinking about manual memory management, programmers can concentrate on other things and will probably be more productive.