File fragmentation, the splitting of files into multiple pieces (fragments) so that they fit into the free space scattered across a hard drive, has always been a problem. Even back when file sizes were normally in the 1 to 2 kilobyte range, files split into tens or hundreds of fragments were not uncommon. Because of the extra I/O traffic required to access such files, computer performance could be brought to its knees, and defragmenters hit the market early on to address the problem. But today file sizes have grown enormously, file fragmentation has grown with them, and the problem is more serious than ever.
Today a file size of 1 to 2 kilobytes is highly unusual. With graphic, sound and video files now commonplace, file sizes are normally measured in megabytes (1 megabyte is roughly 1,000 kilobytes), and even common application files that were always considered small have grown markedly. Microsoft Word files, which used to be mainly text, now usually include graphics and photos. Microsoft Excel files, which used to consist only of figures and calculations, can now include graphics and photos as well. Microsoft PowerPoint presentations used to consist of graphics and text but now include sound and video. And the number of fragments per file has grown along with the average file size; today, it is not unusual to find files split into tens of thousands or even hundreds of thousands of fragments.
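A rough back-of-the-envelope sketch shows why fragment count, not just file size, drives the performance hit: each fragment costs a seek before any data transfers. The seek and transfer figures below are illustrative assumptions, not measurements of any particular drive.

```python
# Toy model of read time for a contiguous vs. fragmented file.
# SEEK_MS and TRANSFER_MB_S are assumed, illustrative values.

SEEK_MS = 9.0          # assumed average seek + rotational latency per fragment
TRANSFER_MB_S = 100.0  # assumed sequential transfer rate

def read_time_ms(file_mb: float, fragments: int) -> float:
    """Estimate time to read a file split into `fragments` pieces:
    one seek per fragment plus the sequential transfer time."""
    transfer_ms = file_mb / TRANSFER_MB_S * 1000.0
    return fragments * SEEK_MS + transfer_ms

contiguous = read_time_ms(10, 1)    # 10 MB file in one piece: 109 ms
fragmented = read_time_ms(10, 500)  # same file in 500 fragments: 4600 ms
```

Under these assumed numbers, the same 10 MB file takes over 40 times longer to read when split into 500 fragments, because seek overhead dominates the actual data transfer.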
Back when file sizes were smaller, defragmenters became available that could be scheduled to run while users were off the computer. At the time, they worked well; a defragmenter could be set to run at night, and by morning defragmentation would be complete and performance restored. Today, however, scheduled defragmentation can no longer keep up with the rate and scope of fragmentation; between scheduled runs, fragmentation continues to build and degrade performance. In some cases, as with today's multi-terabyte drives, fragmentation is not being addressed at all.
Additionally, there are many cases, as with web servers, in which computers are accessed 24x7 and have no "downtime" during which a scheduled defragmentation run can occur.
Because of today's file sizes and the resulting fragmentation, the only true solution is one that is completely automatic and runs invisibly in the background. Instead of defragmenting on a schedule, it defragments whenever idle resources are available. User performance is never negatively impacted, and no scheduling is ever needed.
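The idle-time approach described above can be sketched as a background worker that only touches the queue of fragmented files when the system reports itself idle. This is a minimal toy sketch: `is_system_idle()` and the per-file work are hypothetical stand-ins (a real implementation would sample CPU and disk utilization and call the operating system's defragmentation interfaces), not any vendor's actual API.

```python
import queue
import threading
import time

def is_system_idle() -> bool:
    # Assumption: a real implementation would sample CPU and disk
    # utilization here; this stand-in always reports idle so the
    # sketch runs to completion.
    return True

def background_defrag(work: "queue.Queue[str]", done: list) -> None:
    """Drain the work queue, handling one file at a time, but only
    when idle resources are available; otherwise re-queue and wait."""
    while True:
        try:
            path = work.get(timeout=0.1)
        except queue.Empty:
            return  # no fragmented files left to process
        if is_system_idle():
            done.append(path)   # stand-in for defragmenting one file
        else:
            work.put(path)      # busy right now: defer until idle
            time.sleep(0.1)

work: "queue.Queue[str]" = queue.Queue()
for name in ["a.dat", "b.dat", "c.dat"]:
    work.put(name)

done: list = []
worker = threading.Thread(target=background_defrag, args=(work, done), daemon=True)
worker.start()
worker.join()
```

Because the worker runs on its own thread and defers whenever the idle check fails, no scheduling window is needed and foreground activity is never competed with, which is the design point the paragraph above makes.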
Today’s file sizes mean serious fragmentation–and fortunately, there is now a serious solution for it.