
7 Enterprise Storage Trends for 2018


Enterprises today are generating and storing more data than ever, and the trend shows no sign of slowing down. The rise of big data, the internet of things, and analytics are all contributing to the exponential data growth. The surge is driving organizations to expand their infrastructure, particularly data storage.

In fact, the rapid growth of data and data storage technology is the biggest factor driving change in IT infrastructure, according to the Interop ITX and InformationWeek 2018 State of Infrastructure study. Fifty-five percent of survey respondents chose it as one of the top three factors, far exceeding the need to integrate with cloud services.

Organizations have been dealing with rapid data growth for a while, but are reaching a tipping point, Scott Sinclair, senior analyst at ESG, said in an interview.

“If you go from 20 terabytes to 100 terabytes, that’s phenomenal growth but from a management standpoint, it’s still within the same operating process,” he said. “But if you go from a petabyte to 10 or 20 petabytes, now you start talking about a fundamentally different scale for infrastructure.”

Moreover, companies today see the power of data and understand that they need to harness it in order to become competitive, Sinclair said.

“Data has always been valuable, but often it was used for a specific application or workload. Retaining data for longer periods was more about disaster recovery, having an archive, or for regulatory compliance,” he said. “As we move more into the digital economy, companies want to leverage data, whether it’s to provide more products and services, become more efficient, or better engage with their customers.”

To support their digital strategy, companies are planning to invest in more storage hardware in their data centers, store more data in the cloud, and investigate emerging technologies such as software-defined storage, according to the 2018 State of Infrastructure study. Altogether, they’re planning to spend more on storage hardware than other infrastructure.

Read on for more details from the research and to find out about enterprise storage plans for 2018. For the full survey results, download the complete report.





5 Hot Enterprise Backup and Recovery Vendors


The backup and recovery market has become a crowded space, with hundreds of vendors vying for market share. At the high end of the market, the enterprise data center segment, the bar is set higher, and as a result just a handful of software vendors command most of the sales.

With most tape drive vendors exiting the market, support of other backup media has become essential to maintaining a vendor’s business. Most initially pushed for hard disk-based backup, but the latest trend is to offer cloud storage solutions as well.

In what had become a somewhat stale and undifferentiated market, both HDD/SSD and cloud opened up new opportunities and something of a “space race” has occurred in the industry over the last few years. Backup and recovery vendors have added compression and deduplication, which can radically reduce the size of a typical backup image. This is important when data is moved to a remote storage site via WAN links, since these have lagged well behind compute horsepower and LAN bandwidth.
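To make that reduction concrete, here is a minimal sketch of chunk-level deduplication plus compression (the fixed-size chunks and sample data are illustrative only; real products use content-defined chunking and far more sophisticated indexes):

```python
import hashlib
import zlib

CHUNK_SIZE = 4096  # fixed-size chunks; real products favor variable, content-defined chunking

def dedupe_and_compress(data: bytes):
    """Store each unique chunk once (compressed); return the store plus a rebuild recipe."""
    store = {}   # chunk hash -> compressed chunk (the dedupe index)
    recipe = []  # ordered hashes needed to reassemble the original stream
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:  # only previously unseen chunks consume space
            store[digest] = zlib.compress(chunk)
        recipe.append(digest)
    return store, recipe

# Backup images are full of repetition (OS files, shared libraries), so the
# bytes that actually cross the WAN shrink dramatically:
image = b"base operating system block" * 50_000
store, recipe = dedupe_and_compress(image)
shipped = sum(len(c) for c in store.values())
print(f"raw: {len(image)} B, shipped: {shipped} B, ratio: {len(image) / shipped:.0f}x")
```

On repetitive data like this, the reduction runs to orders of magnitude, which is exactly why these features matter on bandwidth-starved WAN links.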

Many backup and recovery packages create a backup gateway that stores the backup at LAN speeds and then sends it off across the WAN at a more leisurely pace. The benefit is a reduced backup window, though with some risk of data loss if the backup is corrupted before the move to the remote site completes.
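The pattern boils down to “acknowledge locally, replicate later.” A minimal sketch, with an in-memory queue standing in for the gateway’s staging disk and a sleep standing in for the slow WAN link (all names are hypothetical; a real gateway persists the queue and verifies each transfer):

```python
import queue
import threading
import time

wan_queue = queue.Queue()  # holds (name, data) backups staged for WAN transfer

def ingest_backup(name: str, data: bytes):
    """Accept a backup at LAN speed, acknowledge immediately, queue it for the WAN."""
    # A real gateway would write to local disk here before acknowledging.
    wan_queue.put((name, data))
    print(f"{name}: staged locally, backup window closed")

def wan_uploader():
    """Drain the staging queue across the slower WAN link."""
    while True:
        name, data = wan_queue.get()
        time.sleep(len(data) / 1_000_000)  # stand-in for a slow WAN transfer
        print(f"{name}: replicated to remote site")
        wan_queue.task_done()

threading.Thread(target=wan_uploader, daemon=True).start()
ingest_backup("nightly-db", b"x" * 2_000_000)
wan_queue.join()  # demo only; a production gateway would never block on this
```

Until the queue drains, the only copy of a backup lives on the gateway; that window is the data-loss risk noted above.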

Today, the target of choice for backup data is the cloud. It’s secure and highly scalable, and new low-traffic storage services cost very little to rent. The backup gateway encrypts all data, so backups are resistant to tampering, though not necessarily to deletion; protecting against that requires the cloud service provider to offer storage types with only a well-protected manual deletion path.
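As a sketch of that gateway-side encryption, using the third-party cryptography package (the upload call is a hypothetical placeholder, and a real deployment would keep the key in a KMS rather than generating it inline):

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The gateway holds the key; the cloud provider only ever sees ciphertext.
key = Fernet.generate_key()  # in practice, generated once and kept in a KMS/HSM
cipher = Fernet(key)

backup_image = b"contents of tonight's backup"
ciphertext = cipher.encrypt(backup_image)

# upload_to_cloud("backups/nightly-db", ciphertext)  # hypothetical upload call

# Recovery reverses the process and only works with the key:
assert cipher.decrypt(ciphertext) == backup_image
```

Note that encryption protects confidentiality and integrity, not existence: whoever can issue delete calls can still destroy the ciphertext, hence the need for deletion-protected storage classes.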

Continuous data protection (CDP) is one of the hot backup services today; it manifests as either server-side snapshots or high-frequency polling by backup software for changed objects. Using these approaches reduces the data loss window, though it can hurt performance. SSDs help solve most of the performance issues, but daytime WAN traffic will increase.
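A minimal sketch of the polling flavor of CDP, assuming a file’s modification time is a good-enough change signal (real products track changed blocks or hook the filesystem directly):

```python
import os
import time

def poll_for_changes(root: str, interval: float = 5.0):
    """Scan a tree repeatedly, yielding files whose mtime moved since the last pass."""
    last_seen = {}  # path -> mtime at the previous scan
    while True:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                try:
                    mtime = os.stat(path).st_mtime
                except OSError:
                    continue  # file vanished between listing and stat
                if last_seen.get(path) != mtime:
                    last_seen[path] = mtime
                    yield path  # hand the changed file to the backup pipeline
        time.sleep(interval)  # shorter interval: smaller loss window, more overhead

# for changed in poll_for_changes("/var/data"):
#     print("back up:", changed)
```

The interval is the knob behind the tradeoff above: tightening it shrinks the data-loss window at the cost of more scanning and more daytime WAN traffic.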

Noting that access to backup storage tends to occur within just a few hours of the backup itself, some of the newcomers to the space offer a caching function, where data already moved to the remote site is held in the backup gateway for a couple of days. This speeds recovery of cached files.
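That caching function amounts to a small time-bounded cache on the gateway. A minimal sketch, with a hypothetical two-day retention window:

```python
import time

CACHE_TTL = 2 * 24 * 3600  # keep local copies for two days after upload

class RecoveryCache:
    """Hold recently uploaded backups on the gateway so restores can skip the WAN."""

    def __init__(self):
        self._entries = {}  # name -> (upload_time, data)

    def put(self, name: str, data: bytes):
        self._entries[name] = (time.time(), data)

    def get(self, name: str):
        entry = self._entries.get(name)
        if entry and time.time() - entry[0] < CACHE_TTL:
            return entry[1]            # fast local restore from the gateway
        self._entries.pop(name, None)  # expired or missing
        return None                    # caller falls back to the cloud copy
```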

With applications such as Salesforce, MS Office, and Exchange common in the enterprise, optimization capabilities that enable backup without disrupting operations are common features among the main players in data center backup. Many vendors also now offer backup for virtual machines and their contents, and container backup will no doubt become common as well.

There is a school of thought that continuous snapshots, with replicas stored in the cloud, satisfy both backup and disaster recovery requirements, but there are issues with this concept of perpetual storage, not least of which is that a hacker could delete both the primary data and the backups. Not paying your cloud invoice on time can do that, too! The idea is attractive, however, since backup software license fees mostly disappear.

Readers are likely familiar with “old-guard” established backup and recovery vendors such as Veritas, Commvault, Dell EMC, and IBM. In this slideshow, we look at five up-and-coming vendors, in alphabetical order, that are driving innovation in enterprise backup and recovery.





A Primer for Enterprise IT Pros


The buzz around containers, particularly the Docker container platform, is hard to avoid. Containerization of applications promises speed and agility, capabilities that are essential in today’s fast-paced IT environment. But outside the world of DevOps, containers can still be an unfamiliar technology.

At Interop ITX, Stephen Foskett, organizer of Tech Field Day and proprietor of Gestalt IT, provided some clarity about application containerization. In a presentation entitled “The Case for Containers,” he explained the basics of the technology and what enterprise IT shops can expect from it.

First off, container technology isn’t anything new, he said. “The reason we’re hearing about it is Docker. They’ve done a nice job of productizing it.”

He explained that containers are similar to virtual machines “except for this whole idea of user space.” A container, which uses operating system-level virtualization, has strict boundaries around a limited set of libraries and is custom-designed to run a specific application. That focus on one application is a key differentiator from virtual machines and makes containers important for enterprise IT, he said.

Docker, which launched as an open source project in 2013, “got a lot of things right,” Foskett said. For example, Docker Hub makes it easy to locate images, which become containers when users instantiate them. Docker also uses layered storage, which conserves space. At the same time, though, that layered approach can lead to performance issues, he added.

Cattle or pets?

Since cloud technologies began altering the IT landscape, cattle vs. pets has become a common meme. “Many in DevOps will tell you they’re [containers] a cattle approach, but they’re not really cattle; they’re pets,” Foskett said.

While containers can be spun up and torn down quickly, the problem is that by default, Docker doesn’t actually destroy the container, which can lead to container sprawl. “When you exit a container, the container stays there with the data as you left it,” unless manually deleted with the docker rm command, Foskett said.

“If you run a container and stop it, and the image stays around, someone can easily restart the container and access what you were doing,” he said. “That’s probably not a problem on your test laptop, but you can’t do that if you’re engineering a system.”
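For readers who want to see the difference, here is a minimal sketch using the Docker SDK for Python (the docker package, assuming a local Docker daemon); remove=True is the SDK equivalent of docker run --rm:

```python
import docker  # pip install docker

client = docker.from_env()

# Default behavior: the exited container and its writable layer linger,
# which is the sprawl Foskett warns about.
client.containers.run("alpine", "echo hello")
print([c.name for c in client.containers.list(all=True)])  # exited container still listed

# remove=True deletes the container as soon as it exits, leaving no data behind.
client.containers.run("alpine", "echo hello", remove=True)

# Clean up sprawl that already exists (equivalent of `docker container prune`):
client.containers.prune()
```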

Another sticky issue for enterprises: It can be difficult to know the origin of images in the Docker Hub. “You can’t guarantee it’s something good,” Foskett said. “Many enterprises aren’t too thrilled with this concept.”

He advised practicing good hygiene when using containers by keeping images simple and using external volume storage to reduce the risk of data exposure. “Then the container itself stays pristine; you don’t have data building up in it.”
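The external-volume hygiene he describes looks like this with the same Python SDK (the volume and path names are hypothetical):

```python
import docker

client = docker.from_env()

# A named volume lives outside any container's writable layer.
client.volumes.create(name="app-data")

# The container writes its state into the volume rather than into itself,
# so the container stays pristine and can be destroyed freely.
client.containers.run(
    "alpine",
    "sh -c 'echo important-state > /data/state.txt'",
    volumes={"app-data": {"bind": "/data", "mode": "rw"}},
    remove=True,  # the container vanishes on exit; the data survives in the volume
)
```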

Container benefits

One of the main reasons he’s excited about containers, as a system administrator, is that they allow users to specify the entire application environment, Foskett said. A consistent application environment means not having to worry about OS levels, patches, or incompatible applications and utilities.

“This is the critical reason containers are going to be relevant in the enterprise data center,” he said.
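As a small illustration of pinning the whole environment, here is a sketch that builds an image from an inline Dockerfile via the Docker SDK for Python (the base image, library pin, and tag are arbitrary examples):

```python
import io
import docker

client = docker.from_env()

# The entire runtime environment (base OS, patch level, library versions) is
# pinned in the image definition, so it behaves identically on any host.
dockerfile = io.BytesIO(b"""\
FROM python:3.11-slim
RUN pip install requests==2.31.0
CMD ["python", "-c", "import requests; print(requests.__version__)"]
""")
image, _build_log = client.images.build(fileobj=dockerfile, tag="env-demo:1.0")
print(image.tags)  # ['env-demo:1.0']
```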

Another container benefit is security, Foskett said. Security breaches often stem from escalation of privileges to utilities and application components, which affects an entire system. Containerized applications don’t contain unused utilities, so there’s less exposure to infection.

Foskett said containers also enable scalable application platforms using microservices. Instead of monolithic systems that are hard to scale, enterprises can have containerized applications for specific functions.

Getting started

Foskett advised attendees to start experimenting with Docker and Windows containers. “One of the coolest things about Docker is that it’s really easy to try,” he said.

A Docker Enterprise Edition is in the works, which will include certified containers and plugins. When you download a certified image from Docker Hub, “you know it’s really going to be what it says it is,” he said.

Docker Inc., the company that manages the Docker open source project and the ecosystem around it, has traditionally focused on developers, but has shifted to an enterprise mindset, Foskett said. “They’re addressing concerns we have.”

While real microservices won’t happen for another five to ten years, “the future really is containerized,” Foskett told attendees. “This isn’t just a fad or a trend, but an important movement in IT that has important benefits to people like you and me.”




Enterprise Data Storage Shopping Tips


Enterprise data storage used to be an easy field. Keeping up meant just buying more drives from your RAID vendor. With all the new hardware and software today, this strategy no longer works. In fact, the radical changes in storage products affect not only storage purchases, but ripple through to server choices and network design.

This is actually a good-news scenario. In data storage, we spent the better part of three decades with gradual drive capacity increases as the only real excitement. The result was a stagnation of choice, which made storage predictable and boring.

The cloud and solid-state storage have revolutionized thinking and are driving much of the change now happening in the industry. The cloud brings low-cost storage-on-demand and simplified administration, while SSDs make server farms much faster and drastically reduce the number of servers required for a given job.

Storage software is changing rapidly, too. Ceph is the prime mover in open-source storage code, delivering a powerful object store with universal storage capability that provides all three mainstream storage modes (block, file, and object) in a single storage pool. Separately, there are storage management solutions for creating a single storage address space spanning NVDIMMs to the cloud, compression packages that typically shrink raw capacity needs by 5X, virtualization packages that turn server storage into a shared clustered pool, and tools to solve the “hybrid cloud dilemma” of where to place data for efficient and agile operations.
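For a sense of what a single pool with universal access looks like in practice, here is a minimal sketch using Ceph’s Python binding, librados (it assumes a reachable cluster, the standard config file, and an existing pool named mypool):

```python
import rados  # Ceph's librados binding (python3-rados)

# Connect using the cluster's standard config and keyring.
cluster = rados.Rados(conffile="/etc/ceph/ceph.conf")
cluster.connect()

# Block (RBD), file (CephFS), and object (RGW) access are all built on
# objects stored in pools like this one.
ioctx = cluster.open_ioctx("mypool")
ioctx.write_full("hello-object", b"stored once, exposed many ways")
print(ioctx.read("hello-object"))

ioctx.close()
cluster.shutdown()
```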

A single theme runs through all of this: storage is getting cheaper, and it’s time to reset our expectations. The traditional model of a one-stop shop at your neighborhood RAID vendor is giving way to a more savvy commercial off-the-shelf (COTS) buying model, where the interchangeability of component elements is so good that integration risk is negligible. We are still not all the way home on the software side, but hardware is now like Legos, with the parts always fitting together. The rapid uptake of all-flash arrays has demonstrated just how easily COTS-based solutions come together.

The future of storage is “more, better, cheaper!” SSDs will reach capacities of 100 TB in late 2018, blowing away any hard-drive alternatives. Primary storage is transitioning to all-solid-state as we speak, and “enterprise” hard drives are becoming obsolete. The tremendous performance of SSDs has also replaced the RAID array with the compact storage appliance. We aren’t stopping here, though. NVDIMM is bridging the gap between storage and main memory, while NVMe over Fabrics (NVMe-oF) solutions ensure that hyperconverged infrastructure will be a dominant approach in future data centers.

With all these changes, what storage technologies should you consider buying to meet your company’s needs? Here are some shopping tips.



