Transparency has long been an IT dream: as many computer operations as possible should simply take care of themselves. Since the eighties, the word has been bandied about by software publishers and vendors (much like “paradigm shift”), and a quick scan of the latest software news shows that transparency, especially in system software, remains a critical issue. At a minimum, updates, virus scans and other routine system tasks should operate transparently, with no attention from IT or users.
That “transparent” is still an issue comes as no surprise. First, many of the applications and utilities previously promoted as transparent weren’t quite: they caused performance glitches, slowed down applications, or in some cases completely froze a system while attempting to operate “in the background.” Hence in some quarters the goal has yet to be met, developers are still trying, and IT is still waiting.
Second, thanks to a combination of increasing technology complexity, a lack of experienced personnel and the need of many corporations to operate 24x7, datacenter activity has hit near critical mass. It’s a constant struggle just to implement new applications and new technologies such as virtualization, NAS and SAN. Having to manually perform, or even set schedules for, routine system tasks is not only outdated; it means extra work on the backs of already-overwhelmed IT personnel.
One prime example is scheduled disk defragmentation, a task that has been around for so long that many may consider it transparent. A closer look shows this to be anything but true: systems must be analyzed for performance bottlenecks due to fragmentation, and defragmentation schedules must be set accordingly. Not only does this require considerable IT personnel time, it requires experience with a system and its operations, something many personnel lack these days. Clearly, disk defragmentation, along with other such routine tasks, needs to be truly transparent.
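The manual “analyze, then schedule” workflow described above can be sketched in a few lines. This is a minimal illustration, not a real tool: `get_fragmentation_percent` is a hypothetical placeholder (a real script would invoke a platform defragmentation utility in its analysis mode), and the volume names and threshold are assumptions for demonstration.

```python
THRESHOLD = 10.0  # percent fragmentation that warrants a scheduled run (assumed value)

def get_fragmentation_percent(volume: str) -> float:
    # Hypothetical placeholder: returns canned values instead of
    # querying the OS, so the sketch is self-contained and runnable.
    sample = {"C:": 17.5, "D:": 3.2}
    return sample.get(volume, 0.0)

def volumes_needing_defrag(volumes):
    """Return the volumes whose fragmentation exceeds the threshold."""
    return [v for v in volumes if get_fragmentation_percent(v) > THRESHOLD]

print(volumes_needing_defrag(["C:", "D:"]))  # → ['C:']
```

Even in this toy form, the point of the article stands out: someone has to pick the threshold, interpret the analysis, and decide when the run happens, and that judgment takes both time and experience with the system.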
As an aside, scheduled defragmentation has one other problem: it can no longer keep up with today’s frantic rates of fragmentation. Between scheduled runs, fragmentation continues to occur and impact performance. And because of today’s higher-capacity disks and larger file sizes, there are times when a scheduled defragmentation run isn’t actually defragmenting at all.
Due to constant innovation, it well behooves IT directors and system administrators to stay abreast of the latest in transparent technology. Following the defragmentation example, completely transparent defragmentation solutions now exist that require no scheduling and operate invisibly in the background, using only idle system resources. Solutions for other such tasks are coming onto the market every day; use them to take routine tasks off the backs of system personnel.
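The “idle system resources” idea above amounts to a worker that only touches its queue when nothing more important is running. The sketch below, under stated assumptions, shows that shape: `system_is_idle` is a hypothetical stand-in (a real implementation would sample CPU or disk load, e.g. via `os.getloadavg()` on Unix), and here it always reports idle so the example runs deterministically.

```python
import time
from collections import deque

def system_is_idle() -> bool:
    # Hypothetical placeholder: a real check would sample CPU/disk load.
    # Always "idle" here so the demonstration completes immediately.
    return True

def process_in_background(work_items, max_items_per_pass=2, pause=0.5):
    """Drain a queue of work units, working only while the system is idle."""
    queue = deque(work_items)
    done = []
    while queue:
        if not system_is_idle():
            time.sleep(pause)  # back off; yield resources to foreground work
            continue
        # Do only a small batch per pass, then re-check idleness.
        for _ in range(min(max_items_per_pass, len(queue))):
            done.append(queue.popleft())
    return done

print(process_in_background(["fragA", "fragB", "fragC"]))  # → ['fragA', 'fragB', 'fragC']
```

The design choice worth noting is the small batch size: by re-checking idleness between tiny units of work, the task never holds resources long enough for users or applications to notice, which is what distinguishes a truly transparent background task from one that merely runs unattended.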