
Datacenter Automation Takes Another Leap Forward. But What About Defragmentation?

Automation in data centers has hit new highs in the last few years. HP, IBM and other major vendors offer suites that make many aspects of system management fully automatic. IBM's Autonomic Computing concept takes this even further, introducing technologies that manage the entire operation of a data center, not just parts of it. Such a system adjusts resources to current and anticipated workloads, implements technology solutions from a stated policy, integrates new branch-office systems, and much more. With this level of automation being achieved, or at least approached, it seems almost silly that a lower-level function such as defragmentation would not be fully automatic.

Part of the reasoning behind technologies such as Autonomic Computing is that a data center now contains so many functions and so many tasks that human staff cannot possibly keep pace. Add to that the current shortage of trained and experienced IT personnel, and the need for such solutions becomes painfully obvious.

Why, then, would data centers continue to schedule defragmentation? For many years, defragmentation has been a chore basic to maintaining computer performance and reliability. Yet even in this age of total automation, IT staffs are still burdened with the time it takes to analyze an ever-changing set of disk traffic factors within a system, then schedule defragmentation runs so that disk performance is maintained. These runs must also be scheduled so that their own performance hit does not impact users, a neat trick in this era of expanding 24x7 access.
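To make the chore concrete, here is a minimal sketch of the kind of script an administrator ends up maintaining: it samples disk traffic with the psutil library and launches a defragmentation pass only when traffic falls below a threshold. The 5 MB/s threshold and five-minute window are hypothetical placeholders, and the Windows defrag command is used purely as an example; the real burden lies in choosing those values correctly for a workload that never stops changing.

```python
import subprocess
import time

import psutil  # third-party: pip install psutil

# Hypothetical tuning values; picking them correctly for a live
# workload is exactly the analysis chore described above.
QUIET_BYTES_PER_SEC = 5 * 1024 * 1024   # treat under 5 MB/s as "quiet"
SAMPLE_SECONDS = 300                     # observe traffic for 5 minutes

def disk_rate(seconds):
    """Average disk throughput in bytes/sec over a sampling window."""
    start = psutil.disk_io_counters()
    time.sleep(seconds)
    end = psutil.disk_io_counters()
    moved = ((end.read_bytes - start.read_bytes) +
             (end.write_bytes - start.write_bytes))
    return moved / seconds

if disk_rate(SAMPLE_SECONDS) < QUIET_BYTES_PER_SEC:
    # Windows' built-in defragmenter, as an example target.
    subprocess.run(["defrag", "C:", "/U"], check=True)
else:
    print("Disk too busy; the operator must pick another window.")
```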

Aside from the precious time it takes to administer, scheduled defragmentation has itself become outmoded by the constant, frantic rate of fragmentation today. Advanced file, disk, server, and data security technologies, together with enormous file sizes, growing disk capacities, and ever-increasing access, mean that fragmentation continues to degrade performance between scheduled runs. Additionally, in many cases the defragmentation method these solutions employ never actually gets a handle on the fragmentation at all.

What is needed is a defragmentation solution on a par with advances such as Autonomic Computing: one that is completely automatic and transparent, requires no scheduling, has no performance impact, and constantly maintains performance and reliability by using idle resources whenever they are available. Such solutions are now available and should be implemented in every enterprise.
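The shape of such a solution can be sketched as a background loop that performs a small, interruptible unit of work only when the machine is otherwise idle and yields entirely the moment load returns. This is an illustrative outline under stated assumptions, not any vendor's actual engine: defragment_one_file() is a hypothetical placeholder for the incremental work unit, and the idle thresholds are invented for the example.

```python
import time

import psutil  # third-party: pip install psutil

# Hypothetical idle thresholds; a shipping product would tune these.
CPU_IDLE_PCT = 20        # run only when CPU usage is under 20%...
DISK_QUIET_BPS = 1e6     # ...and disk traffic is under ~1 MB/s

def system_is_idle(window=5):
    """True when CPU and disk activity are both below the idle thresholds."""
    cpu = psutil.cpu_percent(interval=window)  # blocks for `window` seconds
    before = psutil.disk_io_counters()
    time.sleep(window)
    after = psutil.disk_io_counters()
    bps = ((after.read_bytes - before.read_bytes) +
           (after.write_bytes - before.write_bytes)) / window
    return cpu < CPU_IDLE_PCT and bps < DISK_QUIET_BPS

def defragment_one_file():
    """Hypothetical placeholder: consolidate one fragmented file, then return."""
    ...

while True:
    if system_is_idle():
        defragment_one_file()   # small unit of work, easily interrupted
    else:
        time.sleep(60)          # yield entirely while users need the disk
```

The design point that eliminates scheduling is the small work unit: because each pass is brief and abandoned as soon as resources are needed elsewhere, there is never a window for staff to plan and never a performance hit for users to notice.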
