Rich Braun wrote:
> Ever since then, it's all been downhill. The fastest computer I ever
> used, in terms of how much it actually got done for me, was a CDC
> Cyber 173 installed in--get this--summer 1978.
>
> Bloatware seems to be killing everything.

Coincidentally, I ran across a mention of Wirth's law yesterday after
reading your message:

http://en.wikipedia.org/wiki/Wirth%27s_law

  Software gets slower faster than hardware gets faster.

or

  Software is decelerating faster than hardware is accelerating.

Here's a mention of it in the context of Moore's law:

http://en.wikipedia.org/wiki/Moore%27s_law

  Another, sometimes misunderstood, point is that exponentially
  improved hardware does not necessarily imply exponentially improved
  software to go with it. The productivity of software developers most
  assuredly does not increase exponentially with the improvement in
  hardware, but by most measures has increased only slowly and
  fitfully over the decades. Software tends to get larger and more
  complicated over time, and Wirth's law even states that "Software
  gets slower faster than hardware gets faster".

More from:

http://en.wikipedia.org/wiki/Wirth%27s_law

  Programs tend to get bigger and more complicated over time, and
  sometimes programmers even rely on Moore's law to justify writing
  slow code, thinking that it won't be a problem because the hardware
  will get faster anyway. As an example of Wirth's law, one can
  observe that booting a modern PC with a modern operating system
  usually takes longer than it did to boot a PC five or ten years ago.

Software is being used to create software. By using integrated
development environments, compilers, and code libraries, programmers
are further and further divorced from the machine and closer and
closer to the software user's needs. With a possible trend towards
"meta" programming, like Intentional Programming[1], where the
programmer only deals with high-level concepts and the tools generate
the low-level code, the performance situation will likely get worse.

1. http://en.wikipedia.org/wiki/Intentional_programming

> The open-source movement is pretty much by definition oriented toward
> bloat: contributions keep coming in and adding to the code pile.

Others have already commented on this, but just as with commercial
software, the degree of bloat depends on how well the project is
managed. Some projects have a very clear definition of what is in
scope and reject contributions that don't fit.

One very well known open source project is the Linux kernel. It's not
exactly known for being without bloat, but from what I've read, Linus
and the others working on it seem to be constantly tossing out
obsolete portions and kicking stuff out to "user space."

 -Tom

-- 
Tom Metro
Venture Logic, Newton, MA, USA
"Enterprise solutions through open source."
Professional Profile: http://tmetro.venturelogic.com/
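As a back-of-the-envelope illustration of the Wirth's law claim above,
here is a toy Python sketch. The growth rates are made-up assumptions
for illustration only, not measurements: hardware is assumed to double
in speed every 18 months, while software is assumed to do 1.7x more
work per year of added "features."

  # Toy model of Wirth's law: hardware speeds up at a fixed exponential
  # rate while software demands more work per year. Both rates below
  # are assumptions chosen to illustrate the claim, not real data.

  HARDWARE_SPEEDUP_PER_18MO = 2.0   # assumed Moore's-law-style doubling
  SOFTWARE_SLOWDOWN_PER_YEAR = 1.7  # assumed growth in work per task

  def effective_speed(years):
      """User-visible speed after `years`, normalized to 1.0 today."""
      hardware = HARDWARE_SPEEDUP_PER_18MO ** (years / 1.5)
      software = SOFTWARE_SLOWDOWN_PER_YEAR ** years
      return hardware / software

  for years in range(0, 11, 2):
      print(f"after {years:2d} years: {effective_speed(years):4.2f}x")

With those assumed rates the hardware gets roughly 100x faster over a
decade, yet the printed effective speed drifts below 1.0x, which is the
"slower to boot than five or ten years ago" experience in miniature.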