When you get down to it, an agent is just a small piece of software that performs a task on a server or desktop. Yet the question of whether to use one has flooded forums with hundreds of posts. How could something so small spark such fierce debate? As with many IT choices, people tend to pick a side: agent or agentless. With two choices there is naturally a division, though the differences really come down to a few key elements:
- Agent-based backups use one or more agents on each endpoint to be backed up. Because the agent lives at the kernel level of the OS stack, installing it requires a system reboot.
- Agentless backup is actually a misnomer. Even with agentless backup, there’s at least one agent on a host machine that centrally manages operations for other endpoints. Unlike agent-based backups, agentless backups don’t require a reboot because they inject code directly into a running system (a technique called hooking, which is also employed by cyberattackers to install malware).
But this debate goes far beyond where a little piece of software lives. There are legitimate pros and cons to both methods. Below are some of the advantages and disadvantages to each.
The Advantages of Agentless Backup
- Less Administration
You don’t have to install agents on every server or machine, and upgrades and new releases don’t require updates to multiple agents.
- No Downtime for Installation
Agentless solutions are easier to implement because they don’t require a system reboot, which helps reduce downtime and work hours.
- Effective for Virtualization
Because the single agent lives on the host machine, backup processing is handled at the hypervisor level rather than inside each guest VM.
- Lower Costs
With agent-based solutions users must often pay for each agent. Scaling an agent-based solution across an enterprise can—in some cases—be more costly.
The Disadvantages of Agentless Backup
- Resource Intensive
Agentless solutions must transmit application commands as well as backed-up data between devices. This can bog down networks that are already near bandwidth capacity.
- Less Reliable
Data is most vulnerable in transit. Since agentless options often take longer to transmit data (see above), data is at greater risk of corruption or loss.
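To put the bandwidth concern above in rough numbers, here is a back-of-the-envelope sketch. The function name, the 2 TB data size, the 1 Gbps link, and the 60% utilization figure are all assumptions for illustration, not measurements from any product.

```python
# Hypothetical illustration: rough backup-window math for an agentless
# solution that moves both commands and data across the network.
def backup_window_hours(data_gb: float, link_mbps: float, utilization: float = 0.6) -> float:
    """Estimate hours to move data_gb over a link_mbps link,
    assuming only `utilization` of the link is free for backup traffic."""
    usable_mbps = link_mbps * utilization
    seconds = (data_gb * 8 * 1000) / usable_mbps  # GB -> megabits, then divide by rate
    return seconds / 3600

# 2 TB over a 1 Gbps link with 60% of the link available to backups:
print(round(backup_window_hours(2000, 1000, 0.6), 1))  # -> 7.4 (hours)
```

The longer that window, the longer data spends in transit, which is exactly why the reliability concern follows the bandwidth concern above.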
The Advantages of Agent-Based Backup
- Most Reliable
Agent-based backups are well known for their reliability because the agent has much more control over the host system. Since agents reside at the kernel level, they have more direct access to data changes on disk sectors, which results in quicker, more reliable backups. Because of tighter integration with Microsoft’s Volume Shadow Copy Service (VSS), agent-based backups can create what are known as application-consistent backups. Basically, the VSS framework stops applications from writing to the disk for the short period it takes to capture a snapshot. This approach prevents the data loss that could occur if writes were still in flight when the snapshot was taken.
- Better for Highly Transactional VMs
Agents are particularly good for highly transactional virtual machines (VMs) that have, say, a database running something like SQL or Exchange. Since the VSS framework can pause these transactions to take a snapshot, there’s less likelihood of errors or data loss.
- Less Resource Intensive
Since each machine has its own agent installed in the OS stack, backup operations will use the machine’s local resources. This means agent-based backups use less overall network bandwidth.
- Faster Data Transfer
Agent-based backups rely on the computing resources of the machine being backed up, not the host. With pre-processing and compression provided by the agent, transmission speeds are often improved.
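The freeze, snapshot, thaw sequence that VSS coordinates (described under "Most Reliable" above) can be sketched as follows. This is an illustrative simulation only: the class and function names are invented for this sketch, and a real agent would call the Windows VSS API rather than anything shown here.

```python
# Illustrative sketch of the application-consistent snapshot sequence
# coordinated by the VSS framework. Names here are hypothetical.
class AppWriter:
    """Stands in for a VSS writer (e.g. the SQL or Exchange writer)."""
    def __init__(self, name: str):
        self.name = name
        self.frozen = False

    def freeze(self):  # flush buffers and pause new writes
        self.frozen = True

    def thaw(self):    # resume normal I/O
        self.frozen = False

def application_consistent_snapshot(writers):
    events = []
    for w in writers:                        # 1. quiesce every application
        w.freeze()
        events.append(f"froze {w.name}")
    events.append("snapshot captured")       # 2. capture while I/O is paused
    for w in writers:                        # 3. resume writes immediately
        w.thaw()
        events.append(f"thawed {w.name}")
    return events

print(application_consistent_snapshot([AppWriter("SQL"), AppWriter("Exchange")]))
```

The key property is ordering: every writer is frozen before the snapshot and thawed only after it, so no application is mid-write at capture time.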
The Disadvantages of Agent-Based Backup
- Potentially Higher Costs
As noted, licensing for agent-based solutions might differ from agentless solutions. Physical and virtual machines all need their own agent and most vendors require a license for each.
- Potential Downtime and Maintenance Hassles
Since admins must reboot systems to install an agent, rollouts cause installation downtime and take time to complete, particularly across large networks.
StorageCraft: The Best of Both Worlds
For many admins, reliability alone is enough to make agent-based solutions the better choice. But it’s not quite that simple. Most enterprises have hybrid environments and often benefit from both: agent-based backups for physical machines and agentless backups for virtualized machines. For example, a mission-critical VM can be protected by an agent while non-critical VMs are protected with an agentless approach. As with any technology, admins must evaluate their own needs and decide which is best.
For enterprises that want reliable backups and the flexibility to choose when to use or not use agents, StorageCraft offers some of the most effective solutions available. Here are a few reasons:
Tight VSS Integration for Reliability
StorageCraft ShadowXafe and ShadowProtect are tightly integrated with VSS, which can pause applications while a snapshot occurs. That is particularly useful for highly transactional systems like SQL databases.
Sector-Level Backups for Smaller Footprint
Using an agent often results in a smaller backup footprint and therefore requires less storage. After the first backup, solutions like ShadowXafe and ShadowProtect create incremental backups. Because these solutions look for changes at the sector level (a sector is typically 512 or 4,096 bytes), the resulting incremental backups take up much less space than increments created by tools that track changes at a coarser granularity. For example, when deploying ShadowXafe agentless protection in a VMware environment, ShadowXafe uses VMware’s change-tracking tool to help create the backups. The smallest unit in which this VMware tool can check for changes is 64 KB. For an environment with files that change frequently, a sector-level backup will be more efficient.
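The granularity difference above is easy to quantify. The sketch below assumes a workload of 1,000 scattered 4 KiB writes, each landing in a different tracked block; the figures are illustrative assumptions, not product benchmarks.

```python
# Hedged arithmetic: how change-tracking granularity affects incremental size.
# Assumes 1,000 scattered 4 KiB writes, each dirtying a different tracked block.
SECTOR = 4 * 1024        # 4 KiB sector-level tracking
CBT_BLOCK = 64 * 1024    # 64 KiB smallest change unit reported by the hypervisor tool

changed_writes = 1000
sector_backup = changed_writes * SECTOR      # back up only the changed sectors
cbt_backup = changed_writes * CBT_BLOCK      # each write dirties a whole 64 KiB block

print(sector_backup // 1024, "KiB vs", cbt_backup // 1024, "KiB")  # -> 4000 KiB vs 64000 KiB
```

Under these assumptions the coarser tracking produces an increment sixteen times larger, which is why scattered small changes favor sector-level backups.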
Get Best of Both Worlds: Flexibility for Agent and Agentless Backup
StorageCraft has you covered by giving you the option to use agentless backups for data that isn’t mission critical, offering the simplicity of host-based backups that are often easier to manage across dozens of virtual machines, and agent-based backups for the data that is.