I ran across an article recently comparing the performance of a 1986 Mac and a 2007 PC. Of course the new machine would totally blow away the old one on a number crunching benchmark, but when it comes to the most mundane benchmarks — time to boot, launch Microsoft Word, open a file, do a search and replace, etc. — the old Mac pulls ahead slightly. Software bloat has increased at roughly the same rate as Moore’s law, making a new machine with new software no better than an old machine with old software in some respects.
The comparisons in the article resonate with my experience. I expect administrative tasks to be quick and number crunching to be slow, and so I’m continually surprised at how long routine tasks take and how quickly numerical software runs.
Update (December 2014): Processors have not gotten much faster in the time since this post was first published, though we have more of them. However, applications do open faster. Perhaps software vendors have paid attention to bloat precisely because the free lunch of faster processors is over.
Here’s a contrary view, based on the economics of storage: Excel today is much cheaper than it was in 1993, at least in terms of the disk space it occupies. http://www.joelonsoftware.com/articles/fog0000000020.html
I have also noticed this and it is very irritating.
But there is hope: my new MBP with an SSD starts fast and ticks all these boxes.
There is a balance between robustness and speed. The old systems were far less reliable. Mac OS circa 1986 would freeze hard, something that hardly ever happens these days.
Windows 7 is orders of magnitude more robust than Windows 3.1.
And don’t even get me started on software that automatically checks for updates every time I start my computer….
I’ll accept my share of the bloat blame: I write new code all day, and when the new software ships, it’s bigger than it was before.
Please don’t shoot me for mentioning the “L” word, but having been Microsoft-free for 4 years (I use a minimalist Linux distro called “Puppy Linux”), I would encourage anyone to at LEAST use Linux part-time (as in dual boot).
My present primary computer is a 12-year-old (I think) generic clone with an 800 MHz processor and 512 MB of RAM. No, I don’t use it for work or production use (I’m disabled), but Puppy makes this old dog run like a Greyhound! I’ve never gone beyond booting it off a live CD (it picks up key files from the hard drives and finishes the boot from there; in this configuration it generally runs in RAM, which makes a lot of difference), but it has served me quite well, and it boots every bit as fast as or faster than any version of Windows I’ve used (especially the later, more bloated versions). I would encourage you to at least consider some lightweight version of Linux in a dual-boot configuration (Puppy is SUPER-easy to start with) as a part-time alternative. I’m not into the heavy sell on Linux; I think it sells itself. And for many tasks (not all) you may find yourself wondering why you’ve been doing everything in MS Windows.
“Update (December 2014): Processors have not gotten much faster in the time since this post was first published”
I find this to be a weird statement; of course processors have gotten faster. And I’m not just talking about multicore processing; single-threaded performance has improved too. Even efficiency and instructions per cycle (IPC) have been increasing.
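To put this in concrete terms: single-thread throughput is roughly instructions per cycle times clock frequency, so IPC gains raise performance even when clock speeds stay flat. Here is a minimal back-of-the-envelope sketch in Python; the IPC and clock figures are made up purely for illustration, not measurements of any real chip.

# Rough model: single-thread throughput ~ IPC x clock frequency.
# All figures below are hypothetical, for illustration only.
def throughput_gips(ipc, clock_ghz):
    """Billions of instructions per second under the simple IPC x clock model."""
    return ipc * clock_ghz

old = throughput_gips(ipc=1.0, clock_ghz=3.0)  # hypothetical circa-2008 core
new = throughput_gips(ipc=1.6, clock_ghz=3.2)  # hypothetical circa-2014 core
print(f"old: {old:.1f} GIPS, new: {new:.1f} GIPS, speedup: {new / old:.2f}x")
# Even with a nearly flat clock, the higher IPC alone yields a ~1.7x single-thread gain.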
Because optimizing startup speed is at the bottom of the priority list compared to adding features that sell the software. As long as the software starts within the time users expect, it’s good enough, so bloat naturally tracks Moore’s law.
I recall an interview with Andy Grove (it may have been by Charlie Rose) where he summarized the state of affairs thusly: “Intel giveth and Microsoft taketh away”.