Tag Archives: Backup

Why You Should Consider Using Azure Backup


In the digital transformation journey, businesses and organizations are producing huge amounts of data that are critical to many business applications. Accompanying this data, however, is the risk of loss from unexpected events or outages. Few businesses can afford to lose data, and those that do can face financial, legal, and other repercussions.

To address the risk of data loss, backup and recovery processes are used to ensure data is safely replicated to an environment other than the original storage location. Various backup options are available, from on-premises servers to cloud backup solutions, including the leading public cloud providers. On-premises backups carry a significant management burden, while the cloud makes it possible to store data remotely.

Cloud backup has emerged as a key option in the last few years. Backing up to the cloud keeps a redundant copy of data that can be accessed remotely from anywhere in the event of an outage or disaster, so business operations can stay active. Storing data in the cloud can significantly shorten recovery times and allows you to restore data to the native environment more quickly.

Considering Microsoft Azure for Data Backup

Microsoft Azure has gained momentum over the last five years, providing cloud services at scale for businesses. Azure Backup is a backup-as-a-service (BaaS) solution integrated into Azure’s public cloud services. It allows you to back up vital business data and customize the service to deal with a variety of operational events. Azure can back up local files and folders, as well as SharePoint, Exchange, and SQL Server data.

Before you decide to move to Azure Backup, certain considerations have to be addressed. These include assessing your storage and recovery requirements, selecting components for backup, choosing resources while keeping an eye on pricing, and understanding how backup and recovery processes will be implemented.

Azure’s backup solution is broadly similar to its counterparts from other public cloud providers, but certain benefits are specific to Azure. Let us look at the key factors businesses should consider before going ahead with Azure Backup.

Hybrid cloud for heterogeneous storage

Azure was initially oriented toward DevOps, the Internet of Things (IoT), and app development, and Microsoft focuses primarily on use cases built on hybrid cloud architecture. This lets businesses keep an on-premises data center for backup while also synchronizing their data in the cloud.

Image source: https://azure.microsoft.com/en-us/blog/windows-server-system-state-backup-azure/

Privacy and security

Privacy and security concerns are paramount for businesses in the digital era. Azure Backup stands out in this regard, offering many built-in security measures to ensure data resides safely in the Azure cloud, isolated from other hosted business data. Azure uses an authentication process that includes a passphrase, which is not stored in the cloud, to prevent a breach.
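
To see why a passphrase that never leaves your hands matters, here is a minimal Python sketch of passphrase-based key derivation using PBKDF2. It is purely illustrative of the general technique; Azure Backup’s actual key handling is internal to its agent, and the passphrase shown is a placeholder.

```python
import hashlib
import os
from typing import Optional, Tuple

def derive_backup_key(passphrase: str, salt: Optional[bytes] = None) -> Tuple[bytes, bytes]:
    """Derive a 256-bit key from a user-held passphrase (illustrative only)."""
    if salt is None:
        salt = os.urandom(16)  # random salt, stored alongside the backup metadata
    # Without the passphrase, the derived key -- and therefore the encrypted
    # backup data -- cannot be reconstructed by anyone, including the provider.
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"), salt, 200_000)
    return key, salt

key, salt = derive_backup_key("example-passphrase-never-stored-in-cloud")
print(key.hex())
```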

In the event of an attempted breach or vulnerability attack, an automated system notifies users so they can act before the attacker reaches the backup system. Azure also automatically creates and stores recovery points for the last 14 days, so you can recover to any of those recent states.

Pricing

Businesses strive to save on CAPEX and OPEX as they transition to digitally enabled operations. Azure Backup offers the affordable pay-as-you-use model that has become the norm in the cloud services domain, with pricing that flexes to a business’s requirements. Unlike counterparts that charge a flat rate for backup storage resources, Azure charges according to the storage consumed at billing time. You can get more information about Azure’s pricing structure on this page.

Scaling

Azure offers real-time, virtually unlimited scaling of server resources, so businesses need not worry about the infrastructure provided as part of the backup service. Backup data remains highly available to applications, and there is no backup infrastructure to monitor.

Unlimited data transfer

Azure has no restriction on the amount of data transferred, either inbound or outbound. With this unlimited data transfer offering, there is no change in pricing, speed, or availability of services. That is an added benefit for businesses in terms of scalability and affordability.

Data Retention

Azure offers short- and long-term data retention through Recovery Services vaults. Azure enforces no limit on retention; users specify and define a retention policy for their data.
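
As a rough illustration of what a retention policy expresses, the Python sketch below prunes a set of recovery points under a simple daily-plus-weekly scheme. The retention windows are hypothetical; a real policy is defined on the Recovery Services vault itself.

```python
from datetime import date, timedelta

def points_to_keep(recovery_points, today, daily_days=14, weekly_weeks=12):
    """Apply a simple daily-plus-weekly retention policy to recovery points.

    Keeps every point from the last `daily_days` days, plus one weekly point
    (Sundays here) going back `weekly_weeks` weeks.  Numbers are hypothetical.
    """
    keep = set()
    for point in recovery_points:
        age_days = (today - point).days
        if 0 <= age_days <= daily_days:
            keep.add(point)
        elif age_days <= weekly_weeks * 7 and point.weekday() == 6:  # Sunday
            keep.add(point)
    return sorted(keep)

daily_points = [date(2023, 3, 1) - timedelta(days=i) for i in range(120)]
print(points_to_keep(daily_points, today=date(2023, 3, 1)))
```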

Ways of storage

Azure offers two replication options to keep backup data highly available. Locally Redundant Storage (LRS) keeps three copies of the data in the same storage rack, so all copies reside on the same premises; it is a low-cost option that protects against hardware failures. Geo-Redundant Storage (GRS), the default option, replicates data to a location geographically distant from the source and is recommended because it also protects data against natural disasters.

Application-Specific Backup

Another crucial benefit of Azure Backup is application-specific backup, which helps you avoid the need for additional fixes when restoring data from a file server, virtual machine, or SQL Server.

Flexibility in restoring data

Azure Backup offers Restore-as-a-Service, which you can use to restore system state files from Azure without many changes. It is also possible to apply the system state to Windows servers with the Windows Server Backup utility.

Summary

Most Azure features are shared with its major competitors, including AWS and Google Cloud Platform. However, some features map to use cases that are an essential part of businesses’ existing IT infrastructure. As most organizations still run Windows as part of their IT ecosystem, Azure Backup is often the obvious option, and shifting copies of data workloads to it can be an easy decision.

 




Evaluating Backup Bandwidth with a Portable WAN Emulator


I’ve been using WAN emulators for well over 20 years and have tried all sorts of Windows and Linux tools as well as appliances.

Personally, I prefer an appliance instead of trying to build my own since I don’t have time to deal with maintaining another computer, figuring out which update messed things up, and wondering how accurate it really is.

Got a call from a client who wanted to locate a backup server offsite and was trying to figure out how much bandwidth the application requires. He was told by the vendor that a 100 Mb link would be fine. After calling several carriers, he found that some can’t provide a 100 Mb link at the recovery location, and the ones that can want an absurd amount of money.

I spoke to the client about the backup server, how it interacts with the master server, and what traffic would be expected. I explained that the best approach would be for me to spend some time emulating various bandwidth scenarios between the servers, with the goal of determining the actual bandwidth requirements.

Fortunately, I knew other clients who use various carriers in the proposed area and gathered some latency, packet loss, and bandwidth values. I chose to use 800-byte pings since that is the servers’ average packet size.
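
For anyone repeating this kind of sampling, a small script can collect the same numbers. The sketch below assumes the Linux iputils ping flags (-c for count, -s for payload size), and the host name is hypothetical.

```python
import re
import subprocess

def average_rtt_ms(host: str, payload_bytes: int = 800, count: int = 20) -> float:
    """Average round-trip time in ms, using ping with a fixed payload size.

    Assumes Linux iputils ping: -c <count>, -s <payload bytes>.
    """
    output = subprocess.run(
        ["ping", "-c", str(count), "-s", str(payload_bytes), host],
        capture_output=True, text=True, check=True,
    ).stdout
    # Summary line looks like: "rtt min/avg/max/mdev = 11.2/12.4/15.1/0.9 ms"
    match = re.search(r"= [\d.]+/([\d.]+)/", output)
    if match is None:
        raise RuntimeError("could not parse ping summary")
    return float(match.group(1))

print(average_rtt_ms("backup-server.example.net"))  # hypothetical recovery-site host
```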

It is important to understand how your WAN emulator behaves. Some products ask you to input host and network characteristics and then predict performance and application response time; unfortunately, this requires some knowledge of all those variables. Others are referred to as ‘modeling’ tools, with a slightly different twist in that you can provide real device configurations, logs, or trace files.

In this case, I used my portable Apposite Linktropy Mini-G WAN emulator because I wanted to use the real hosts and application, since I had no time to perform an application baseline.

Methodology and documentation are critical when conducting these exercises. Mine started with sitting down with the client and having him show me a few important tasks or monitoring exercises. I took notes and screenshots to document what was done and the expected results. We also noted how responsive the application was.

Then I asked the client what bandwidth he would like to emulate, and he replied 100 Mb and 10 Mb. I added that we should use the 1 Gb connection as a reference point and baseline.

I suggested that we start with 10 Mb. We had a discussion to determine the best approach and the impact to the application. Since this was a lab, I had a lot of flexibility that I normally don’t have.

The plan was to disconnect the standby server, forcing all traffic to the master server, and to disconnect the second backup Ethernet connection to ensure all traffic went through a single Ethernet port.

After installing the device inline, we confirmed that the server was online, and then the client repeated all the tasks we had documented previously. This part was interesting because, even though only about 30 minutes had passed, he had already changed the process and skipped a few steps. When I showed him what we did earlier, he chuckled and we went through it again. This documentation can be reused if I want to repeat the test or automate it with one of the various programming or macro packages.

At the end of the tests, we both agreed that everything ran well at 10 Mb, but when we went back to the server we saw a “synchronizing files” message along with a list of files. I asked what that was all about, and he explained that when the connection between the master and slave is broken and reconnected, the application performs a database synchronization. I suggested we use that as another test point.

Below is a screenshot of the Linktropy Mini-G and our settings for the 10 Mb test. For most of my labs, classes, and engagements, the Linktropy Mini-G fits the bill, but there are various other models to meet your needs.

The results require a bit of interpretation, and the client interview was key to understanding and reporting a conclusion. At first glance, it looks like 10 Mb is clearly the loser.

The point that puts this time into context is that the synchronization only occurs when the servers lose connectivity. After performing some quick captures, I determined that the heartbeat packets are sent every 90 seconds and the regular database updates are only approximately 2.5 MB when required. The real determining factor is that there was no performance degradation while we were set to 10 Mb.
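
Some back-of-the-envelope arithmetic makes the trade-off concrete. The sketch below estimates transfer times for the routine 2.5 MB update and for a hypothetical 10 GB full synchronization at several link speeds; the 10 GB figure and the 90% efficiency factor are assumptions for planning, not measurements from this lab.

```python
def transfer_seconds(size_mb: float, link_mbps: float, efficiency: float = 0.9) -> float:
    """Rough transfer time: size in megabytes over a link in megabits/second.

    `efficiency` approximates protocol overhead; every number here is an
    estimate for planning, not a measurement from the lab test above.
    """
    bits = size_mb * 8_000_000
    return bits / (link_mbps * 1_000_000 * efficiency)

for link_mbps in (10, 100, 1000):
    update = transfer_seconds(2.5, link_mbps)             # routine 2.5 MB update
    full_sync = transfer_seconds(10_000, link_mbps) / 60  # hypothetical 10 GB full sync
    print(f"{link_mbps:>5} Mb/s: update {update:6.2f} s, full sync {full_sync:7.1f} min")
```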

So 10 Mb seems to work, with the caveat that a full database synchronization takes much longer. The client said this is not a deal breaker, since he can still use the application during the synchronization process with no real performance hit.





5 Hot Enterprise Backup and Recovery Vendors


The backup and recovery market has become a crowded space, with hundreds of vendors vying for market share. At the higher end of the market, the enterprise data center segment, the bar is higher and the result is that just a handful of software vendors command most of the sales.

With most tape drive vendors exiting the market, support of other backup media has become essential to maintaining a vendor’s business. Most initially pushed for hard disk-based backup, but the latest trend is to offer cloud storage solutions as well.

In what had become a somewhat stale and undifferentiated market, both HDD/SSD and cloud targets opened up new opportunities, and something of a “space race” has occurred in the industry over the last few years. Backup and recovery vendors have added compression and deduplication, which can radically reduce the size of a typical backup image. That matters when data is moved to a remote storage site over WAN links, whose capacity has lagged well behind compute horsepower and LAN bandwidth.
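
As a toy illustration of why deduplication plus compression shrinks a backup image, the Python sketch below splits data into fixed-size chunks, stores each unique chunk once, and compresses it. Real products typically use variable-size, content-defined chunking, so treat this only as a sketch of the idea.

```python
import hashlib
import zlib

def dedup_and_compress(data: bytes, chunk_size: int = 4096):
    """Fixed-size chunk deduplication plus per-chunk compression (toy version).

    Returns (chunk_store, recipe): chunk_store maps a chunk's SHA-256 digest to
    its compressed bytes; recipe is the ordered list of digests needed to
    rebuild the original stream.
    """
    chunk_store = {}
    recipe = []
    for offset in range(0, len(data), chunk_size):
        chunk = data[offset:offset + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in chunk_store:
            chunk_store[digest] = zlib.compress(chunk)
        recipe.append(digest)
    return chunk_store, recipe

# Repetitive data (typical of backup images) shrinks dramatically.
image = b"the same block of log data repeated over and over " * 100_000
store, recipe = dedup_and_compress(image)
stored = sum(len(c) for c in store.values())
print(f"{len(image)} raw bytes -> {stored} stored bytes across {len(store)} unique chunks")
```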

Many backup and recovery packages create a backup gateway that stores the backup at LAN speed and then sends it across the WAN at a more leisurely pace. The benefit is a reduced backup window, though with some risk of data loss if the backup is corrupted before the move to the remote site completes.

Today, the target of choice for backup data is the cloud. It’s secure, very scalable, and new low-traffic services cost very little to rent. The backup gateway encrypts all data, so backups are hack-proof, though not necessarily deletion-proof; guarding against deletion requires the cloud service provider to offer storage types with only a well-protected manual deletion path.

Continuous data protection (CDP) is one of the hot backup services today; it manifests as either server-side snapshots or high-frequency polling by backup software for changed objects. These approaches reduce the data-loss window, though they can hurt performance. SSDs solve most of the performance issues, but daytime WAN traffic will increase.
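
The polling flavor of CDP can be pictured with the short sketch below, which rescans a directory at a fixed interval and reports files whose modification time changed. The directory path and interval are placeholders, and real products use far more efficient change tracking precisely to avoid the performance cost noted above.

```python
import time
from pathlib import Path

def poll_for_changes(root: str, interval_s: float = 5.0):
    """Yield files whose modification time changed since the previous scan.

    A toy stand-in for the polling style of continuous data protection; real
    products track changes via journals or snapshots to avoid the overhead of
    repeatedly walking the filesystem.
    """
    last_seen = {}
    while True:
        for path in Path(root).rglob("*"):
            if not path.is_file():
                continue
            mtime = path.stat().st_mtime
            if last_seen.get(path) != mtime:
                last_seen[path] = mtime
                yield path
        time.sleep(interval_s)

# Example (hypothetical data directory):
# for changed in poll_for_changes("/srv/data"):
#     print("would replicate", changed)
```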

Noting that access to backup storage tends to occur within just a few hours of the backup itself, some newcomers to the space offer a caching function in which data already moved to the remote site is also held in the backup gateway for a couple of days. This speeds recovery of cached files.

With applications such as Salesforce, MS Office, and Exchange common in the enterprise, optimization capabilities that enable backup without disrupting operations are common features among the main players in data center backup. Many vendors also now offer backup for virtual machines and their contents, and container backup will no doubt become common as well.

There is a school of thought that says continuous snapshots, with replicas stored in the cloud, solve both backup and disaster recovery requirements, but there are issues with this concept of perpetual storage, not least of which is that a hacker could delete both the primary data and the backups. Not paying your cloud invoice on time can do that, too! The idea is attractive, however, since license fees for backup software mostly disappear.

Readers are likely familiar with “old-guard” established backup and recovery vendors such as Veritas, Commvault, Dell EMC, and IBM. In this slideshow, we look at five up-and-coming vendors, in alphabetical order, that are driving innovation in enterprise backup and recovery.

(Image: deepadesigns/Shutterstock)




Is the Network Part of Your Data Backup Strategy?


Make sure to include the network in your data protection planning.

A data backup strategy is the backbone of any enterprise IT shop. Businesses need to protect their data from application or server failures, as well as from improper data manipulation, deletion, or destruction through accidental or nefarious means such as ransomware.

In planning their backup strategy, companies can overlook the network as part of the overall design. Distributed and server-to-cloud backups rely on the underlying network to move data from point A to B in a timely and secure manner. Therefore, it makes sense to include the network as an integral part of any data backup and recovery strategy. I’ll discuss four ways to do that.

Network redundancy

The first and most obvious step is to verify that your network maintains a proper level of end-to-end resiliency. Whether you are talking about local, off-site or backups to cloud service providers, the network should be designed so that there are no single points of failure that could potentially render a data backup or restore useless. A single point of failure refers to a device or link that, if it fails, will bring down all or a large portion of a LAN.

Also, consider how automated your network failover mechanisms are. Traditional network redundancy techniques include dynamic routing protocols, HSRP/VRRP, VPN, and WAN carrier diversity. More recently, SDN, SD-WAN, and multi-cloud management are beginning to be included as part of a forward-thinking data backup roadmap.

Network baselining

Data backups have the potential to consume a tremendous amount of throughput. The major concern is that certain links along the way will become congested to the point that they negatively impact other applications and users on the network. Avoiding congestion by building a separate network purpose-built for backups is cost-prohibitive, so most enterprises perform backups over the same network hardware and links as their production traffic.

Consequently, a key step in any backup strategy is to baseline traffic across the network to determine how backups will affect link utilization. Understanding the data flows and throughput requirements of backups, along with current utilization baselines over time, allows engineers to design a backup strategy that will not impact daily operations. In some cases, this means scheduling backups outside of peak network hours; in others, it will require upgrading the throughput capacity of certain links along a backup path.

Once a backup plan is in place, it’s necessary to continue to monitor link utilization using NetFlow and SNMP tools to ensure that bottlenecks don’t creep up on you over time.
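
The underlying arithmetic is simple: utilization is the bit delta between two counter samples divided by the sampling interval and the link speed. Below is a minimal sketch, with illustrative counter values rather than real SNMP output.

```python
def link_utilization_pct(octets_t0: int, octets_t1: int,
                         interval_s: float, link_bps: float) -> float:
    """Percent utilization between two samples of an interface octet counter.

    octets_* are cumulative byte counts such as ifHCInOctets gathered via SNMP;
    link_bps is the link speed in bits per second.  Counter-wrap handling is
    omitted to keep the sketch short.
    """
    bits = (octets_t1 - octets_t0) * 8
    return 100.0 * bits / (interval_s * link_bps)

# Two samples taken 300 seconds apart on a 1 Gb/s link (illustrative numbers):
print(f"{link_utilization_pct(1_200_000_000, 12_450_000_000, 300, 1e9):.1f}% utilized")
```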

QoS

Another way to mitigate the impact backups can have on shared network links is to leverage quality of service (QoS) techniques. Using QoS, we can identify, mark, and ultimately prioritize traffic flows as they traverse a network. Large companies with highly complex networks and backup strategies often opt to mark data backups with a lower-priority class so that more critical, time-sensitive applications, such as voice and streaming video, take priority and traverse the network freely when link congestion occurs.

Backup packets are queued or dropped according to policy and automatically transmit when the congestion subsides. This allows round-the-clock backups without strict off-hours backup windows and alleviates concern that the backup process will impair production traffic sharing the same links.
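
One way to make backup traffic identifiable to such a policy is to mark it at the source. The sketch below sets the IP TOS/DSCP byte on a socket (this works on Linux); the DSCP value chosen, and whether your network devices trust host marking, are decisions for your own QoS design, and the target host is hypothetical.

```python
import socket

# DSCP CS1 is commonly used as a low-priority "bulk" class; whether switches
# and routers trust host marking is a design decision, not a given.
DSCP_CS1 = 8
TOS_VALUE = DSCP_CS1 << 2  # DSCP occupies the upper six bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)  # Linux

# The backup job then sends its data over this socket; under congestion the
# network queues or drops it per policy, as described above.
# sock.connect(("backup-target.example.net", 10000))  # hypothetical target
```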

Data security

No conversation about backups is complete without discussing data security. From a network perspective, this includes a plan for extending internal security policies and tools out to the WAN and cloud where off-site backups will eventually reside.

Beyond these data protection basics, network and security administrators must also battle shadow IT, which is becoming a serious problem for the safety and backup/restore capabilities of corporate data. Backups are only useful when they capture all critical data, and shadow IT prevents that because data is increasingly stored in unauthorized cloud applications.

Tools such as NetFlow and cloud access security broker (CASB) platforms can help track down and curb the use of shadow IT. A CASB can monitor traffic destined to the Internet and control what cloud services employees can use.
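
As a rough first pass, flow records can be screened against a list of sanctioned destinations to surface possible shadow-IT data movement. In the sketch below, the allow list, flow format, and sample data are all hypothetical, and this is not a substitute for a CASB.

```python
from collections import Counter

SANCTIONED = {"sharepoint.company.com", "backup.company.com"}  # hypothetical allow list

def unsanctioned_destinations(flows):
    """Count outbound bytes to destinations not on the sanctioned list.

    `flows` is an iterable of (src_host, dst_host, bytes_out) tuples, e.g.
    exported from a NetFlow collector.  A rough first pass at spotting
    shadow-IT data movement, not a replacement for a CASB.
    """
    counts = Counter()
    for _src, dst, bytes_out in flows:
        if dst not in SANCTIONED:
            counts[dst] += bytes_out
    return counts.most_common()

sample = [
    ("pc-17", "sharepoint.company.com", 120_000),
    ("pc-17", "personal-drive.example.com", 2_400_000_000),
    ("pc-23", "personal-drive.example.com", 900_000_000),
]
print(unsanctioned_destinations(sample))
```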




Backup and Recovery Software: IT Pros Weigh In


How can enterprise IT professionals know which data backup and recovery software to choose for their business? There are numerous products on the market for this critical data center function.

Peer reviews published by real users facilitate this software decision-making with user feedback, insight, and product rankings that collectively indicate which solutions are in the lead. With this knowledge, potential users are equipped to choose the product offering best-suited to their organizational needs.

Based on real user reviews at IT Central Station, these five products are some of the top data backup and recovery solutions on the market. The reviews from IT pros provide valuable insight into the products’ benefits and shortcomings.

Veeam Backup

Chris C., a systems engineer at a business law firm, shared this perspective: “With moving the Veeam server to a physical server and creating a proxy server on each of the hosts, we are able to leverage SAN-based backup, which is very fast. Jobs are completed overnight and never run into the business hours.”

Alberto Z., a senior IT engineer at a tech company, noted an area for improvement: “Determining the space for the WAN acceleration server sometimes is hard, especially if you have many source sites for the data. I would like to have a kind of storage space calculator that gives me an estimate for the size of the WAN accelerator server we are creating; give it a list of VMs to be backed up.”

Read more Veeam Backup reviews by IT Central Station users.

HPE Data Protector

Darren O., systems engineer at a biotech company, provided this review of HPE Data Protector: “The granularity of brick-level restore functionality is very valuable. We receive approximately 10 restore requests on a daily basis for your typical file/folder restore, with the odd Exchange mailbox restore request thrown in, just to keep me on my toes.”

A systems and data services leader at a financial services firm who goes by the handle HeadOfSy6f42 said he would like to have more capacity. “This can be done by having more deduplication and compression. If they can compress the data more and more, we will save more space,” he noted.

Read more HPE Data Protector reviews by IT Central Station users.

Asigra

Guy N., CEO at a tech services firm, cited two primary improvements in the Asigra platform with the recent version 13.1 SP1:

  • “Virtualization: a tremendous variety of data protection solutions for virtual machines.
  • Cloud-2-Cloud: Office 365, Google, Amazon, etc. This is a full package of data protection platform!”

He also provided insight about Asigra’s cost and licensing features:

“It is important to be perfectly knowledgeable about Asigra’s pricing model. It is somewhat more complex than other backup vendors, but it includes a huge opportunity for savings, especially with their recovery license model (RLM).”

Read more Asigra reviews by IT Central Station users. 

Altaro VM Backup

IT support engineer Vasileios P. offered this view: “Simplicity and reliability. I had some difficulties activating the product, but after the activation phase all went smooth…I could create VM backups from live machines without any issues. The restore process also was very quick.”

However, Chaim K., CEO of a tech services company, said he needs “to be able to restore emails to Exchange Live not just to a PST. This is a major drawback as I want to be able to restore individual items or mailboxes directly into my live Exchange databases so the user can see the email right away.”

Read more Altaro VM Backup reviews by IT Central Station users.

Commvault

Dan G., senior system administrator for a healthcare organization, wrote that Commvault’s “most valuable feature is the ability to backup over the dedicated Fiber Channel directly from SAN. There’s no impact to the network or users…Backups happen overnight instead of three days. Storage for backups has been reduced by 60%.”

He added that the “bare-metal restore needs some work. It’s not intuitive and seems to have been an afterthought.”

Read more Commvault reviews by IT Central Station users.


