
5 Hot Enterprise Backup and Recovery Vendors


The backup and recovery market has become a crowded space, with hundreds of vendors vying for market share. At the high end of the market, the enterprise data center segment, the bar is set higher, and as a result just a handful of software vendors command most of the sales.

With most tape drive vendors exiting the market, support of other backup media has become essential to maintaining a vendor’s business. Most initially pushed for hard disk-based backup, but the latest trend is to offer cloud storage solutions as well.

In what had become a somewhat stale and undifferentiated market, both HDD/SSD and cloud opened up new opportunities and something of a “space race” has occurred in the industry over the last few years. Backup and recovery vendors have added compression and deduplication, which can radically reduce the size of a typical backup image. This is important when data is moved to a remote storage site via WAN links, since these have lagged well behind compute horsepower and LAN bandwidth.
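
To make the deduplication idea concrete, here is a minimal Python sketch that splits a backup stream into fixed-size chunks, hashes each one, and stores only chunks it has not seen before. It is an illustration of the general technique, not any vendor's implementation; real products typically use content-defined chunking and far more sophisticated indexes.

    import hashlib

    CHUNK_SIZE = 4 * 1024 * 1024   # 4 MiB fixed-size chunks, chosen arbitrarily for the example

    def dedup_backup(stream, chunk_store):
        """Store only unseen chunks; return the list of hashes that reconstructs the stream."""
        recipe = []
        while True:
            chunk = stream.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in chunk_store:       # new data: keep it
                chunk_store[digest] = chunk
            recipe.append(digest)               # duplicate data costs only a small reference
        return recipe

Backing up two near-identical VM images through a routine like this stores their shared blocks once, which is where the radical reduction in backup size comes from.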

Many backup and recovery packages create a backup gateway that stores the backup at LAN speeds and then sends it off across the WAN at a more leisurely pace. The benefit is a reduced backup window, though with some risk of data loss if the backup is corrupted before the move to the remote site completes.
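
The gateway pattern described above can be sketched as a local spool plus a background mover; the upload_to_remote helper below is hypothetical, standing in for whatever WAN or cloud transfer a product actually uses.

    import pathlib, queue, threading

    SPOOL_DIR = pathlib.Path("/var/spool/backup-gateway")    # assumed local staging area
    SPOOL_DIR.mkdir(parents=True, exist_ok=True)
    pending = queue.Queue()

    def accept_backup(name: str, data: bytes):
        """Land the backup at LAN speed, then hand it to the WAN mover."""
        path = SPOOL_DIR / name
        path.write_bytes(data)        # fast local write is what shrinks the backup window
        pending.put(path)

    def wan_mover():
        while True:
            path = pending.get()
            upload_to_remote(path)    # hypothetical slower WAN transfer; failure here is the
            pending.task_done()       # data-loss risk noted above

    threading.Thread(target=wan_mover, daemon=True).start()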

Today, the target of choice for backup data is the cloud. It’s secure, very scalable, and new low-traffic services cost very little to rent. The backup gateway encrypts all data, which makes backups resistant to tampering, though not necessarily to deletion; protecting against deletion requires the cloud service provider to offer storage types with only a well-protected manual deletion path.
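
As a sketch of encrypting data before it leaves the gateway, the snippet below uses the symmetric Fernet recipe from the widely used cryptography package. The cipher choice and key handling here are illustrative assumptions, not what any particular vendor ships.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()    # in practice the key lives in a KMS or HSM, never beside the backups
    cipher = Fernet(key)

    def encrypt_for_cloud(plaintext: bytes) -> bytes:
        return cipher.encrypt(plaintext)    # only ciphertext ever crosses the WAN

    def decrypt_on_restore(ciphertext: bytes) -> bytes:
        return cipher.decrypt(ciphertext)   # restores require the key, not just access to the bucket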

Continuous data protection (CDP) is one of the hot backup services today; it manifests as either server-side snapshots or high-frequency polling by backup software for changed objects. Using these approaches reduces the data loss window, though it can hurt performance. SSDs help solve most of the performance issues, but daytime WAN traffic will increase.
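
The polling flavor of CDP can be pictured as a loop that rescans the protected tree on a short interval and ships anything whose modification time has advanced. A minimal sketch, with a hypothetical ship_to_backup transfer function:

    import os, time

    def poll_for_changes(root, interval=60):
        seen = {}                                    # path -> last mtime we backed up
        while True:
            for dirpath, _, files in os.walk(root):
                for name in files:
                    path = os.path.join(dirpath, name)
                    try:
                        mtime = os.path.getmtime(path)
                    except FileNotFoundError:        # file deleted mid-scan
                        continue
                    if seen.get(path) != mtime:      # new or modified object
                        ship_to_backup(path)         # hypothetical transfer to the backup gateway
                        seen[path] = mtime
            time.sleep(interval)                     # shorter interval = smaller loss window, more I/O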

Noting that access to backup storage tends to occur within just a few hours of the backup itself, some of the newcomers to the space offer a caching function, where data already moved to the remote site is held in the backup gateway for a couple of days. This speeds recovery of cached files.
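
One way to picture that cache is a local store with a retention window: restores hit the gateway copy when it is recent enough and only fall back to the cloud otherwise. A simplified sketch with a hypothetical fetch_from_cloud call:

    import time

    RETENTION_SECONDS = 2 * 24 * 3600    # hold local copies for roughly two days
    cache = {}                           # backup name -> (time stored, data)

    def restore(name):
        entry = cache.get(name)
        if entry and time.time() - entry[0] < RETENTION_SECONDS:
            return entry[1]              # fast path: recent backup still sitting on the gateway
        data = fetch_from_cloud(name)    # hypothetical slow path back across the WAN
        cache[name] = (time.time(), data)
        return data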

With applications such as Salesforce, MS Office and Exchange common in the enterprise, optimization capabilities that enable backup without disrupting operations are common features among the main players in data center backup. Many vendors also now offer backup for virtual machines and their contents, and container backup will no doubt become common as well.

There is a school of thought that says that continuous snapshots, with replicas stored in the cloud, solve both backup and disaster recovery requirements, but there are issues with this concept of perpetual storage, not least of which is that a hacker could delete both primary data and the backups. Not paying your cloud invoice on time can do that, too! The idea is attractive, however, since license fees for software mostly disappear.

Readers are likely familiar with “old-guard” established backup and recovery vendors such as Veritas, Commvault, Dell EMC, and IBM. In this slideshow, we look at five up-and-coming vendors, in alphabetical order, that are driving innovation in enterprise backup and recovery.





Is the Network Part of Your Data Backup Strategy?


Make sure to include the network in your data protection planning.

A data backup strategy is the backbone of any enterprise IT shop. Businesses need to protect their data from application or server failures, as well as from improper data manipulation, deletion or destruction through accidental or nefarious means such as ransomware.

In planning their backup strategy, companies can overlook the network as part of the overall design. Distributed and server-to-cloud backups rely on the underlying network to move data from point A to B in a timely and secure manner. Therefore, it makes sense to include the network as an integral part of any data backup and recovery strategy. I’ll discuss four ways to do that.

Network redundancy

The first and most obvious step is to verify that your network maintains a proper level of end-to-end resiliency. Whether you are talking about local backups, off-site backups or backups to cloud service providers, the network should be designed so that there are no single points of failure that could render a data backup or restore useless. A single point of failure refers to a device or link that, if it fails, will bring down all or a large portion of a LAN.

Also, consider how automated your network failover mechanisms are. Traditional network redundancy techniques include dynamic routing protocols, HSRP/VRRP, VPN and WAN carrier diversity. More recently, SDN, SD-WAN and multi-cloud management are beginning to be included in forward-thinking data backup roadmaps.
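
To show the kind of automation these first-hop redundancy protocols provide, the sketch below mimics a VRRP-style election: among reachable gateways, the highest priority wins, with the highest IP address as the tiebreaker. It is a deliberately simplified model of the election logic, not a configuration example.

    import ipaddress

    def elect_active_gateway(gateways):
        """gateways: list of dicts such as {'ip': '10.0.0.2', 'priority': 110, 'alive': True}."""
        candidates = [g for g in gateways if g['alive']]
        if not candidates:
            return None     # every gateway is down: a single point of failure somewhere else
        # Highest priority wins; ties broken by highest IP address, as VRRP does.
        return max(candidates,
                   key=lambda g: (g['priority'], ipaddress.ip_address(g['ip'])))

If the highest-priority router fails a health check, the election immediately shifts traffic to the surviving peer, which is exactly the behavior a backup job crossing that gateway depends on.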

Network baselining

Data backups have the potential to consume a tremendous amount of throughput. The major concern is that certain links along the way will become congested to the point that it negatively impacts other applications and users on the network. Avoiding network congestion by using a separate network that’s purpose-built for backups is cost prohibitive. Most enterprises perform backups using the same network hardware and links as their production traffic.

Consequently, a key step in any backup strategy is to properly baseline traffic across the network to determine how backups will impact link utilization. Understanding the data flows and throughput requirements of backups, along with utilization baselines measured over time, allows engineers to design a backup strategy that will not impact daily operations. In some cases, this means scheduling backups outside of network peak hours. In other situations, it will require upgrading the throughput capacity of certain network links along a backup path.
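
A back-of-the-envelope way to apply those baselines: given the average utilization for each hour of the day and the extra throughput a backup job needs, flag the hours in which the combined load stays under a safety threshold. The capacity and threshold below are made-up numbers for illustration.

    LINK_CAPACITY_MBPS = 1000      # assumed 1 Gbps WAN link
    SAFETY_THRESHOLD = 0.80        # keep combined load under 80% of capacity

    def backup_friendly_hours(hourly_baseline_mbps, backup_mbps):
        """Return the hours of day during which the backup fits under the threshold."""
        ok = []
        for hour, baseline in enumerate(hourly_baseline_mbps):   # 24 averaged samples
            if (baseline + backup_mbps) / LINK_CAPACITY_MBPS <= SAFETY_THRESHOLD:
                ok.append(hour)
        return ok

    # e.g. backup_friendly_hours(baseline_from_flow_data, backup_mbps=300)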

Once a backup plan is in place, it’s necessary to continue to monitor link utilization using NetFlow and SNMP tools to ensure that bottlenecks don’t creep up on you over time.
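
The arithmetic behind that monitoring is simple: utilization over a polling interval is the delta between two interface octet counters (ifInOctets/ifOutOctets from SNMP), converted to bits and divided by what the link could have carried in that time. A minimal sketch that ignores counter wrap:

    def link_utilization_percent(octets_start, octets_end, interval_s, link_speed_bps):
        """Percent utilization from two SNMP octet-counter samples taken interval_s apart."""
        bits_moved = (octets_end - octets_start) * 8          # counters count bytes
        return 100.0 * bits_moved / (interval_s * link_speed_bps)

    # Two ifOutOctets samples 300 s apart on a 1 Gbps link:
    # link_utilization_percent(0, 15_000_000_000, 300, 1_000_000_000) == 40.0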

QoS

Another way to mitigate the impact backups can have on shared network links is to leverage quality of service (QoS) techniques. Using QoS, we can identify, mark and ultimately prioritize traffic flows as they traverse a network. Large companies with highly complex networks and backup strategies often opt to mark and prioritize data backups at a lower class, so that more critical, time-sensitive applications, such as voice and streaming video, take priority and traverse the network freely when link congestion occurs.

Backup packets are queued or dropped according to policy and will automatically transmit when the congestion subsides. This allows for round-the-clock backups without the need for strict off-hours backup windows and alleviates concern that the backup process will impair production traffic that shares the same network links.
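
Conceptually, the scheduler behavior described here is strict-priority queuing: the lower backup class is served only when the higher classes have nothing waiting, and its packets are dropped once its queue fills. The sketch below is a toy model of that policy, not any vendor's QoS implementation.

    from collections import deque

    BACKUP_QUEUE_LIMIT = 1000
    queues = {2: deque(), 1: deque(), 0: deque()}   # 2 = voice/video, 1 = business apps, 0 = backups

    def enqueue(packet, priority):
        if priority == 0 and len(queues[0]) >= BACKUP_QUEUE_LIMIT:
            return False                            # policy: drop backup packets when their queue is full
        queues[priority].append(packet)
        return True

    def dequeue():
        for priority in (2, 1, 0):                  # backups move only when the higher classes are empty
            if queues[priority]:
                return queues[priority].popleft()
        return None                                 # link is idle

In practice the backup transport (typically TCP) retransmits anything dropped once congestion subsides, which is what makes round-the-clock backups tolerable on shared links.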

Data security

No conversation about backups is complete without discussing data security. From a network perspective, this includes a plan for extending internal security policies and tools out to the WAN and cloud where off-site backups will eventually reside.

Beyond these data protection basics, network and security administrators must also battle shadow IT, which is becoming a serious problem for the safety and backup/restore capability of corporate data. Backups are only useful when they capture all critical data, and shadow IT undermines this because data is increasingly stored in unauthorized cloud applications.

Tools such as NetFlow and cloud access security broker (CASB) platforms can help track down and curb the use of shadow IT. A CASB can monitor traffic destined to the Internet and control what cloud services employees can use.




Backup and Recovery Software: IT Pros Weigh In


How can enterprise IT professionals know which data backup and recovery software to choose for their business? There are numerous products on the market for this critical data center function.

Peer reviews published by real users facilitate this software decision-making with user feedback, insight, and product rankings that collectively indicate which solutions are in the lead. With this knowledge, potential users are equipped to choose the product offering best-suited to their organizational needs.

Based on real user reviews at IT Central Station, these five products are some of the top data backup and recovery solutions on the market. The reviews from IT pros provide valuable insight into the products’ benefits and shortcomings.

Veeam Backup

Chris C., a systems engineer at a business law firm, shared this perspective: “With moving the Veeam server to a physical server and creating a proxy server on each of the hosts, we are able to leverage SAN-based backup, which is very fast. Jobs are completed overnight and never run into the business hours.”

Alberto Z., a senior IT engineer at a tech company, noted an area for improvement: “Determining the space for the WAN acceleration server sometimes is hard, especially if you have many source sites for the data. I would like to have a kind of storage space calculator that gives me an estimate for the size of the WAN accelerator server we are creating; give it a list of VMs to be backed up.”

Read more Veeam Backup reviews by IT Central Station users.

HPE Data Protector

Darren O., systems engineer at a biotech company, provided this review of HPE Data Protector: “The granularity of brick-level restore functionality is very valuable. We receive approximately 10 restore requests on a daily basis for your typical file/folder restore, with the odd Exchange mailbox restore request thrown in, just to keep me on my toes.”

A systems and data services leader at a financial services firm who goes by the handle HeadOfSy6f42 said he would like to have more capacity. “This can be done by having more deduplication and compression. If they can compress the data more and more, we will save more space,” he noted.

Read more HPE Data Protector reviews by IT Central Station users.

Asigra

Guy N., CEO at a tech services firm, cited two primary improvements in the Asigra platform with the recent version 13.1 SP1:

  • “Virtualization: a tremendous variety of data protection solutions for virtual machines.
  • “Cloud-2-Cloud: Office 365, Google, Amazon, etc. This is a full package of data protection platform!”

He also provided insight about Asigra’s cost and licensing features:

“It is important to be perfectly knowledgeable about Asigra’s pricing model. It is somewhat more complex than other backup vendors, but it includes a huge opportunity for savings, especially with their recovery license model (RLM).”

Read more Asigra reviews by IT Central Station users. 

Altaro VM Backup

IT support engineer Vasileios P. offered this view: “Simplicity and reliability. I had some difficulties activating the product, but after the activation phase all went smooth…I could create VM backups from live machines without any issues. The restore process also was very quick.”

However, Chaim K., CEO of a tech services company, said he needs “to be able to restore emails to Exchange Live not just to a PST. This is a major drawback as I want to be able to restore individual items or mailboxes directly into my live Exchange databases so the user can see the email right away.”

Read more Altaro VM Backup reviews by IT Central Station users.

Commvault

Dan G., senior system administrator for a healthcare organization, wrote that Commvault’s “most valuable feature is the ability to backup over the dedicated Fiber Channel directly from SAN. There’s no impact to the network or users…Backups happen overnight instead of three days. Storage for backups has been reduced by 60%.”

He added that the “bare-metal restore needs some work. It’s not intuitive and seems to have been an afterthought.”

Read more Commvault reviews by IT Central Station users.


