It’s a staggering thought that just over 20 years ago, a standard corporate server had a 100-megabyte hard drive and an astonishing 8 megabytes of memory. The system drive and data drive were often one and the same unless the system manager was somewhat progressive; hard drives were too expensive to add frivolously. And the discovery that file fragmentation heavily impacted performance was so new that one major hardware manufacturer, Digital Equipment Corporation (DEC), officially denied it was a problem.
System managers were far ahead of the “official denial,” however. They spent nights and weekends handling fragmentation by the only method known: backing up and restoring their drives. Backup and restore was a long, arduous task, and the techie community clamored for a solution to fragmentation.
Enterprising software developers listened, and several rushed defragmenters to market. These early products were unsafe and corrupted data, leaving an opening for other developers who arrived a short time later with safe defrag programs that actually did the job. The battle was short: once system managers found that safe defragmenters existed, sales soared. Even though the first defragmenters had to be run manually, that was still better than backup and restore.
Development progressed, and scheduled defragmenters arrived, relieving overworked sysadmins and letting them enjoy their nights and have work-free weekends.
While defrag technology advanced, scheduling remained the standard. As PCs replaced mainframes and disks became cheaper, the number of disks proliferated. Instead of employees using terminals connected to one computer, each had their own PC with its own drive. Now sysadmins had to schedule defragmentation on each computer, creating far more work.
Today’s drives have over 100 times the capacity of those of 20 years ago, and not only does each user have their own computer, each computer has one or more drives. Where does that leave fragmentation and its solutions?
Fragmentation itself occurs more than ever before. It’s common for files to be split into hundreds or even thousands of fragments, slowing performance to a crawl. Scheduled defragmentation, meanwhile, has become difficult to manage and outdated.