When larger, faster volumes first came to market a few years ago, part of the value proposition put forward at the time was that they would never require defragmenting, a claim attributed in equal parts to their capacity, their speed and their on-board caching.
Several years down the road, real-world experience has taught us otherwise. Quite the contrary: as many sites have discovered, defragmentation is needed more than ever. Why?
First, such a broad, sweeping generality should always be suspect. Years ago (for a seasonal example outside of computing), when microwave ovens first came on the market, there were claims that a full-size turkey could be cooked in an hour and that conventional ovens would never be needed again. If you’ve ever made the mistake of trying to microwave a turkey, you know how laughably wrong this particular bit of promotion was.
Second, a lesson that should have been taken from physics was not: nature abhors a vacuum. If there is space to fill, it will be filled. This has certainly proven true with large-capacity drives, especially given today’s much larger file sizes: video, sound and multimedia among them.
Third (and it’s hard to believe this was not taken into account), server disks not only fill up quickly no matter their size, but access to them is constant. Those two factors, plus the necessity of fast access to support the business, make defragmentation of such volumes a must.