Baetjer does a good job (except for using a plodding style that seems contagious among Austrian economists) of arguing on a theoretical level that the software industry is suffering from the same kind of problem that centrally planned economies suffer from. But when translating from theory to practice his understanding often breaks down, and he ends up advocating a much less free approach than I want.
Baetjer's description of Smalltalk sounds like it was written by a ParcPlace salesman. More importantly, it partly makes the kind of mistake that the more theoretical parts of the book warn against: paying too much attention to the initial vision behind the design of a project, and too little to the adaptation of the project after observing interactions between it and users or other software.
I think it would have been more instructive to use Perl as an example of a good language. Smalltalk is an elegant language that looks like it was largely designed at one time before being implemented. Perl is a classic example of a bunch of unrelated ideas being hacked together as needed to produce an ugly result.
Baetjer's theory implies that languages like Perl will tend to be better than programmers expect, while languages designed like Smalltalk will not. The relative success of the languages in the marketplace clearly confirms Baetjer's theory, and leaves me wondering why someone who understands economics as well as he appears to would assume without clear evidence that the message sent by that marketplace is irrational.
I tried Smalltalk once, and gave up after writing about 30 lines because of its limited ability to interface with the outside world. Specifically, I had trouble reading some files I had that contained integers in 16-bit binary form, and the output capabilities of the implementation I used were lousy. I probably had a poor implementation: it had no GUI, and was missing the allegedly powerful Smalltalk environment, possibly because that environment wasn't as portable (i.e. adaptable to new market conditions) as a language should be.
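For comparison, the task that stopped that Smalltalk attempt is a few lines in a typical scripting language. A minimal Python sketch (the function name is mine, and the file's byte order is an assumption; the actual files may have been big-endian):

```python
import struct

def read_int16s(path, byte_order="<"):
    """Read a file containing packed 16-bit signed integers.

    byte_order: "<" for little-endian, ">" for big-endian
    (which one the real files used is an assumption here).
    """
    with open(path, "rb") as f:
        data = f.read()
    count = len(data) // 2          # two bytes per integer
    return list(struct.unpack(f"{byte_order}{count}h", data))
```

The point is not that this is elegant, only that an ad-hoc language which bothered to interface well with the outside world makes the job trivial.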
In contrast, Perl fills what used to be an important gap in the set of tools I need to build software, and it improves my productivity as long as I resist the temptation to write large (300+ line) programs in it. This is despite Perl being such an erratically designed language that after writing maybe 30,000 lines of code in it, I probably don't have a clue how something like 30 or 40 percent of the language features work. For comparison, by the time I had written that many lines of C++ (not exactly the world's simplest language), I could at least fool myself into thinking I understood the language well enough to implement a compiler for it with little need to consult references.
As with planned versus evolved human languages (e.g. Loglan vs. English), or centrally planned cities versus cities which result from decentralized decisions, there's a strong temptation with programming languages to see the elegance of design, and to overlook the merits of ad-hoc adaptations.
Nothing in the argument above is intended to draw any conclusions about the overall merit of any language (Perl is clearly optimized for different purposes than Smalltalk), merely to compare their actual value to what people expect. Although it does seem a bit ironic that Perl is better at writing small programs and servers that often amount to the small components that Baetjer complains are missing from the software industry, whereas the few Smalltalk applications I recall hearing of sounded like large, probably monolithic systems. Baetjer seems to overlook Perl's value for creating components because it's an adaptation that doesn't fit the vision that some people have been planning for the software industry (and it's no accident that I'm phrasing this the way I would phrase an attack on a socialist).
Baetjer points out that code generation tools work when used as a way to replace low-level programming with high-level programming, but not when they try to become a substitute for programming. I wish he had applied this kind of insight to his discussion of prototyping. The "throw away the first implementation" approach to prototyping seems to reject the Austrian insights about the importance of implicit knowledge (which I expect often gets embedded in prototypes). An alternative I consider promising is the one promoted by the Python programming language: start by writing everything in a high-level language, and later rewrite some of the most CPU-intensive components in C for speed, without ever needing to rewrite the components which worked well in the initial phase.
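That workflow can be sketched in a few lines. Everything below is illustrative: `_fastdot` stands for a hypothetical C extension module written only after profiling, and is not a real library. The prototype is written first; the C version later replaces it without any caller changing:

```python
def _dot_py(xs, ys):
    """Pure-Python prototype: written first, clear, and kept as the fallback."""
    return sum(x * y for x, y in zip(xs, ys))

try:
    # Hypothetical C extension, added only after profiling showed this
    # function dominated the run time (the module name is an assumption).
    from _fastdot import dot
except ImportError:
    # No extension available: the working prototype serves unchanged,
    # so the implicit knowledge embedded in it is never thrown away.
    dot = _dot_py
```

The components that worked well in the initial phase keep running as-is; only the measured hot spot is ever rewritten.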
Baetjer's argument for superdistribution ("The single most important problem the software industry faces today is its lack of markets for working capital - reusable software components." - page 143) ignores the existence of open source software, and his solution appears to require either that the software components be closed source and run only on closed source operating systems (at least for significant parts of the code), or that users be trusted to allow payments which they could disable by modifying the relevant code.
I've become dependent enough on the benefits of an open source operating system that I wouldn't give it up without a more compelling case than Baetjer thinks he has presented.
In fact, it isn't clear that there's any shortage of rewards for building reusable software components, or that there is much lack of interest that such rewards could alleviate. The open source phenomenon has demonstrated that substantial energy is being directed towards improving the quality of software, and it eliminates some of the obstacles that Baetjer assumes are hindering the development of reusable components (e.g. the scarcity of space on store shelves), yet the trend towards reusable components there has been quite modest.
Are open source programmers still producing inferior code because their incentives are weighted too much towards immediate projects? Or are they doing the best they can, and disappointing because theorists' expectations were too high (I'm unconvinced that the software industry is less specialized than other industries of comparable maturity) and because standards which balance well the need for rich data types with the need for simplicity are hard to create?
Does having proprietary code create important disincentives to standardization? My experience says probably yes. The costs to an open source programmer of switching from contributing to a project adapted to a failing standard to contributing to a project that is setting the dominant standard are often close to zero. In contrast, the equivalent switch in commercial systems typically requires programmers to switch employers. Even with self-employed programmers, the incentives that Baetjer hopes will promote investment directed towards reusability will create some disincentives to switching to the winning standard.
And the "not invented here" syndrome seems to be reinforced by proprietary approaches. I can't adequately explain this, but I'm convinced I am more likely to adopt that syndrome while working on a commercial project than when I'm writing freeware.
It will be expensive to answer these questions. I hope there will be some experimentation with open-source systems that encourage users to allow superdistribution-style payments for software component use, but I suspect that this will be no more successful than the shareware approach has been.
Economists are focusing a lot of energy on battling the tragedy of the commons because that has proven important in the past, but open source software has provided a clear case where the commons creates benefits that are hard for propertarian alternatives to compete with. There is no reason to stick with a capitalist approach when a freer approach has been proven to work.
The URL of this document is http://www.rahul.net/pcm/baetjer_review.html