Thursday, August 9, 2007
Microsoft DRM code for Netflix streams hacked
"Macworld has posted a story by IDN News Service about a hacker who posted instructions (http://www.macworld.com/news/2007/08/09/netflix/index.php) for saving streaming movies from Netflix, defeating Microsoft's DRM code designed to prevent users from saving the content.
Friday, July 20, 2007
Google's $4.6 billion plan for an open wireless Internet
Good news if it happens: http://machinist.salon.com/blog/2007/07/20/google_fcc/
Tuesday, July 17, 2007
On Brushing One's Teeth
One of my all-time favorite articles:
On Brushing One's Teeth
By Gregory V. Wilson, Dr. Dobb's Journal, July 22, 2001. URL: http://www.ddj.com/dept/architect/184410460
Greg is the author of Practical Parallel Programming (MIT Press, 1995), and coeditor with Paul Lu of Parallel Programming Using C++ (MIT Press, 1996). Greg can be reached at gvwilson@interlog.com.
Have you ever wondered why people live longer than they used to? It's not because of heart transplants and CAT scanners, or radiotherapy, or fiber optic cameras that let doctors take pictures of people's insides. In fact, sanitation and nutrition account for about three quarters of our increased longevity, with antibiotics and the invasive surgery made possible by anaesthesia responsible for most of the rest. As any dentist will tell you, it all comes down to brushing your teeth after meals.
Studies have similarly shown that working practices make a bigger difference to software development than any tool developed since interactive debuggers and version control systems. GUI builders, regression testing frameworks, CASE design aids, and fault tracking systems don't improve programmer productivity and software quality; the working practices to which they are adjuncts do.
It is therefore puzzling that almost all software development is still done ad hoc, and that good practice is routinely ignored. We start coding before we have thought through a design, or scribble down a design before finding out what users actually want, or test new classes only after incorporating them into the final product. It's as if we all knew about cavities, but still refused to floss.
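(An aside from me, not part of Wilson's article: to make the flossing concrete, here is a minimal sketch of what testing a class in isolation, before it ever touches the final product, might look like. The ShoppingCart class and its tests are invented purely for illustration, using Python's built-in unittest module.)

import unittest


class ShoppingCart:
    """Hypothetical class, made up only to illustrate testing in isolation."""

    def __init__(self):
        self._line_items = []

    def add(self, name, unit_price, quantity=1):
        if unit_price < 0 or quantity < 1:
            raise ValueError("unit_price must be >= 0 and quantity >= 1")
        self._line_items.append((name, unit_price * quantity))

    def total(self):
        return sum(amount for _, amount in self._line_items)


class ShoppingCartTest(unittest.TestCase):
    # These tests exercise the class on its own, long before it is
    # wired into any larger product.

    def test_empty_cart_totals_zero(self):
        self.assertEqual(ShoppingCart().total(), 0)

    def test_add_accumulates_total(self):
        cart = ShoppingCart()
        cart.add("toothbrush", 2.50, quantity=2)
        cart.add("floss", 1.25)
        self.assertAlmostEqual(cart.total(), 6.25)

    def test_rejects_negative_price(self):
        with self.assertRaises(ValueError):
            ShoppingCart().add("mystery item", -1.00)


if __name__ == "__main__":
    unittest.main()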
Given that we programmers are, by and large, fairly intelligent people, why are we so reluctant to do what we know is right? We would accomplish more, with less strain on our personal lives, and we would be able to take some pride in the things we build. One possibility is that we collectively suffer from a mild form of Pygmalion's frenzy. Legend says that Pygmalion created a statue that was so perfect that it came to life, but that he died of exhaustion as a result of his hard work.
Programmers are taught -- by example, if nothing else -- that programming is about typing in and debugging code. Most of us (myself included) don't feel like we're really working unless our fingers are on a keyboard and the compiler is humming away. Few programmers really care about "lusers": technical prowess, the admiration of our peers, and mastery over the monsters we have created are more satisfying than methodical craftsmanship. If this is true, then programmers don't work well because working well actually takes time away from the thing they enjoy most: programming.
Another possibility is that we don't work well because we don't really need to. There's an old story about two hunters being chased by a bear. The first hunter says, "Why are we running? We can't run faster than a bear," and the second replies, "I don't need to run faster than the bear, I just need to run faster than you." Time to market is so important in the software industry that throwing down some code, any code, is sometimes the only way to meet the deadlines that we (or our marketing departments) have set. Market conditions and products change so quickly that code is more likely to be thrown away and replaced than modified or upgraded, so what's the point of writing it well? Of course, this is a chicken-and-egg situation: programmers throw code away because it wasn't built well, and then, because they have to start from scratch, churn out code without building it well. We keep getting away with this because we have trained our market to have very low expectations. So long as market leaders like Microsoft are able to demonstrate that your products don't have to be well-designed in order to be extravagantly successful, what incentive is there for the rest of us to do better?
While all of these explanations are part of the picture, I think that the most likely reason for bad practice is ignorance. Most people, myself included, have only a hazy notion of what good working practices actually look like. This isn't because we didn't do a software engineering course as part of our degree, or because we forgot it all once the course was over. It isn't even because most software engineering education focuses on large project and time scales, which students can't relate to -- there are now books, like McConnell's Rapid Development, which focus on small teams (4 people) and medium timescales (12 months), so that students can put what they're being taught into a personally meaningful context. No, I think the real reason that most of us don't know good practice is simply that we have never seen it applied in real life.
Few academics follow good practice, primarily because they don't need to: they're always in rapid prototyping mode, trying to get the next paper cranked out, and rarely have to care if their code is robust or maintainable. Most of us are trained on "projects" that consist of one-week throw-away assignments, done in isolation. As a colleague pointed out, every project she has worked on since leaving university has been bigger than every project she worked on during her undergraduate degree. (She also pointed out that I recently undertook a major re-architecture of a product three weeks before its supposed ship date -- which slipped -- despite all the books I've read.)
Sadly, I don't see any real prospect for change. I keep thinking that software is becoming so complicated that it will have to be designed right. Building a GUI is hard -- until someone invents a WYSIWYG toolkit like Visual Basic that lets programmers continue to be sloppy. Concurrency is even harder -- so most programmers ignore it, or only use it in a couple of very conservative, well-understood ways. Plug-in component technology like COM is the hardest of all -- and yet people seem to be managing, if you call the time wasted with screwed-up DLLs and other bit rot "managing".
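(Another aside from me, not from the article: one of those "conservative, well-understood" ways of using concurrency is a fixed pool of worker threads pulling jobs from a thread-safe queue, with no other shared mutable state. The sketch below is illustrative only, using Python's standard queue and threading modules.)

import queue
import threading

NUM_WORKERS = 4

jobs = queue.Queue()
results = queue.Queue()


def worker():
    # Each worker owns no shared state; it only communicates through the queues.
    while True:
        item = jobs.get()
        if item is None:          # sentinel value: time to shut down
            jobs.task_done()
            break
        results.put(item * item)  # stand-in for the real work
        jobs.task_done()


threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

for n in range(20):
    jobs.put(n)
for _ in threads:                 # one shutdown sentinel per worker
    jobs.put(None)

jobs.join()
for t in threads:
    t.join()

collected = []
while not results.empty():
    collected.append(results.get())
print(sorted(collected))          # squares of 0..19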
I also think that teaching institutions won't change because they don't need to. The goal of most academics is publication, not robust software. Most of the people who teach programming have little or no large-scale experience, and let's face it, group projects and multi-step projects are hard.
I think change will only come when software starts killing people -- not just one or two, but dozens. Mechanical engineering didn't become a discipline until a lot of people were killed by exploding steam boilers. Attempts here in Canada to set up degrees in software engineering have been blocked by professional engineering societies, which have argued that software development is nowhere near rigorous enough to merit the term 'engineering'. But sooner or later, a bug in a car's braking system or a flipped bit in an airplane's flap controller is going to kill a couple of hundred people. In the tidal wave of lawsuits that follow, somebody will finally force us all to start brushing our teeth.