7 IT Infrastructure Certifications Gaining Value


IT professionals often wonder whether the time and expense involved in acquiring a certification is worth it. And if it is, which certification should they pursue?

Foote Partners recently released its 2018 IT Skills Demand and Pay Trends Report, which helps answer those questions. Based on data from 3,188 North American employers, the report found that the average market value for the 446 IT certifications it tracks climbed 0.3% in the first quarter of the year. While that doesn’t seem like much, the study also found that on average, having a single certification was worth 7.6% of an IT worker’s base pay.

Drilling down into the data, the report also found that a select group of certifications had gained 10% or more in market value during the six months ending April 1. Several of those certifications are related to IT infrastructure, and those are the certifications highlighted on the following slides.

However, the fact that a given certification has recently increased in value doesn’t necessarily mean that demand for the underlying skill is rising or that the trend will continue. The report rightly points out that many factors can influence supply and demand for certifications, including ones people don’t often consider, such as vendors aggressively marketing certain certifications or overhauling their certification programs.

Foote Partners’ data also revealed that market value volatility for tech skills is leveling out. The analysts attribute the decreasing fluctuation in pay for various skills to “something more urgent”: the arrival of “game-changing emerging technologies” like blockchain, the internet of things, artificial intelligence, automation, data analytics, and new cybersecurity advances. These areas could see some of the highest employment demand in coming months and years, and vendors haven’t yet created certifications related to some of this newer tech.

Still, for IT professionals who follow career trends, it’s worth noting which certifications are seeing sharp upticks in demand. And according to Foote Partners, the following certs in the networking/communications and systems administration categories gained significant value.

Note: Certifications are arranged in alphabetical order, not in order of relative value.




7 Enterprise Storage Trends for 2018


Enterprises today are generating and storing more data than ever, and the trend shows no sign of slowing down. Big data, the internet of things, and analytics are all contributing to the exponential data growth. The surge is driving organizations to expand their infrastructure, particularly data storage.

In fact, the rapid growth of data and data storage technology is the biggest factor driving change in IT infrastructure, according to the Interop ITX and InformationWeek 2018 State of Infrastructure study. Fifty-five percent of survey respondents chose it as one of the top three factors, far exceeding the need to integrate with cloud services.

Organizations have been dealing with rapid data growth for a while, but are reaching a tipping point, Scott Sinclair, senior analyst at ESG, said in an interview.

“If you go from 20 terabytes to 100 terabytes, that’s phenomenal growth, but from a management standpoint, it’s still within the same operating process,” he said. “But if you go from a petabyte to 10 or 20 petabytes, now you start talking about a fundamentally different scale for infrastructure.”

Moreover, companies today see the power of data and understand that they need to harness it in order to become competitive, Sinclair said.

“Data has always been valuable, but often it was used for a specific application or workload. Retaining data for longer periods was more about disaster recovery, having an archive, or for regulatory compliance,” he said. “As we move more into the digital economy, companies want to leverage data, whether it’s to provide more products and services, become more efficient, or better engage with their customers.”

To support their digital strategy, companies are planning to invest in more storage hardware in their data centers, store more data in the cloud, and investigate emerging technologies such as software-defined storage, according to the 2018 State of Infrastructure study. Altogether, they’re planning to spend more on storage hardware than on any other type of infrastructure.

Read on for more details from the research and to find out about enterprise storage plans for 2018. For the full survey results, download the complete report.

(Image: Peshkova/Shutterstock)




Big Data Storage: 7 Key Factors


Defining big data is actually more of a challenge than you might think. The glib definition talks of masses of unstructured data, but the reality is that it’s a merging of many data sources, both structured and unstructured, to create a pool of stored data that can be analyzed for useful information.

We might ask, “How big is big data?” The answer from storage marketers is usually “Big, really big!” or “Petabytes!”, but again, there are many dimensions to sizing what will be stored. Much big data becomes junk within minutes of being analyzed, while some needs to stay around. This makes data lifecycle management crucial. Add to that globalization, which brings foreign customers to even small US retailers. The requirements for personal data lifecycle management under the European Union General Data Protection Regulation take effect in May 2018, and penalties for non-compliance are draconian, even for foreign companies: up to 4% of global annual revenue.
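
One practical piece of data lifecycle management is automated expiration on the object store itself. The snippet below is a minimal sketch using boto3 against S3; the bucket name, prefix, and 30-day window are illustrative assumptions, not prescriptions from the article.

    # Hypothetical sketch: expire short-lived ingest data automatically so it
    # doesn't linger past its useful (or legally permitted) life.
    import boto3

    s3 = boto3.client("s3")

    s3.put_bucket_lifecycle_configuration(
        Bucket="example-analytics-bucket",         # assumed bucket name
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "expire-raw-ingest",
                    "Filter": {"Prefix": "raw/"},  # only the transient ingest area
                    "Status": "Enabled",
                    "Expiration": {"Days": 30},    # assumed retention window
                }
            ]
        },
    )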

For an IT industry just getting used to the term terabyte, storing petabytes of new data seems expensive and daunting. This would most definitely be the case with RAID storage arrays; in the past, an EMC salesman could retire on the commissions from selling the first petabyte of storage. But today’s drives and storage appliances have changed all the rules about the cost of capacity, especially where open source software can be brought into play.

In fact, there was quite a bit of buzz at the Flash Memory Summit in August about appliances holding one petabyte in a single 1U rack unit. With 3D NAND and new form factors like Intel’s “Ruler” drives, we’ll reach the 1 PB goal within a few months. It’s a space, power, and cost game changer for big data storage capacity.

Concentrated capacity requires concentrated networking bandwidth. The first step is to connect those petabyte boxes with NVMe over Fabrics running on Ethernet, at 100 Gbps today, with vendors already in the early stages of 200 Gbps deployment. This is a major leap forward in network capability, but even that isn’t enough to keep up with drives designed with massive internal parallelism.

Compression helps in many big data storage use cases, from removing repetitive images of the same lobby to deduplicating repeated chunks of Word files. New compression methods that use GPUs can handle tremendous data rates, giving those petabyte 1U boxes a way of quickly talking to the world.
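
As a rough CPU-side illustration of the idea (the point above is about GPU-accelerated compression, which this sketch does not attempt), the code below chunks a stream, skips chunks it has already seen, and compresses the rest before they reach storage. The 64 KB chunk size is an arbitrary assumption.

    import hashlib
    import zlib

    CHUNK_SIZE = 64 * 1024  # assumed chunk size

    def dedupe_and_compress(stream):
        """Yield (chunk_hash, compressed_bytes) for chunks not seen before."""
        seen = set()
        while True:
            chunk = stream.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            if digest in seen:
                continue            # duplicate chunk: store only a reference elsewhere
            seen.add(digest)
            yield digest, zlib.compress(chunk)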

The exciting part of big data storage is really a software story. Unstructured data is usually stored in a key/data format, on top of traditional block IO, which is an inefficient method that tries to mask several mismatches. Newer designs range from extended metadata tagging of objects to storing data in an open-ended key/data format on a drive or storage appliance. These are embryonic approaches, but the value proposition seems clear.
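
To make the extended-metadata idea concrete, here is a small, hedged sketch using the S3 API via boto3; the bucket, key, and metadata fields are invented for illustration and are not tied to any particular product mentioned above.

    import boto3

    s3 = boto3.client("s3")

    # User-defined metadata travels with the object, so downstream analytics or
    # lifecycle tooling can act on it without parsing the payload itself.
    s3.put_object(
        Bucket="example-sensor-archive",                       # assumed bucket
        Key="telemetry/2018/03/device-42.json",                # assumed key
        Body=b'{"temp_c": 21.4, "ts": "2018-03-01T00:00:00Z"}',
        Metadata={
            "source": "device-42",
            "retention-class": "regulatory",
            "schema-version": "3",
        },
    )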

Finally, the public cloud offers a home for big data that is elastic and scalable to huge sizes. This has the obvious value of being always right-sized to enterprise needs, and AWS, Azure, and Google have all added a strong list of big data services to match. With huge instances and GPU support, cloud virtual machines can emulate an in-house server farm effectively, and make a compelling case for a hybrid or public cloud-based solution.

Suffice it to say, enterprises have a lot to consider when they map out a plan for big data storage. Let’s look at some of these factors in more detail.

(Images: Timofeev Vladimir/Shutterstock)




7 Ways to Secure Cloud Storage


Figuring out a good path to security in your cloud configurations can be quite a challenge. This is complicated by the different types of cloud we deploy – public or hybrid – and the class of data and computing we assign to those cloud segments. Generally, one can create a comprehensive and compliant cloud security solution, but the devil is in the details, and a nuanced approach to different use cases is almost always required.

Let’s first dispel a few myths. The cloud is a very safe place for data, despite FUD from those who might want you to stay in-house. The large cloud service providers (CSPs) run a tight ship, simply because they’d lose customers otherwise. Even so, we can assume their millions of tenants include some that are malevolent, whether hackers, government spies, or commercial thieves.

At the same time, don’t assume that CSP-encrypted storage is automatically safe. If the CSP relies on drive-based encryption, don’t count on it: security researchers in 2015 uncovered flaws in a particular hard drive product line that rendered the automatic encryption useless. That’s lazy man’s encryption. Do it right and encrypt in the server with your own key set.
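
A minimal sketch of that advice, with deliberately simplified key handling: data is encrypted on your side before it ever reaches the provider, using a key you control. The bucket and object names are assumptions, and a real deployment would keep the key in your own KMS or HSM.

    import boto3
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()           # in practice, held in your own KMS/HSM
    cipher = Fernet(key)

    ciphertext = cipher.encrypt(b"customer ledger export")

    # The provider stores and replicates only ciphertext.
    boto3.client("s3").put_object(
        Bucket="example-secure-bucket",   # assumed bucket
        Key="exports/ledger.enc",
        Body=ciphertext,
    )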

Part of the data security story is that data must maintain its integrity under attack. It isn’t sufficient to have one copy of data; just think what would happen if all three replicas of a set of files in your S3 pool were “updated” by malware. If you don’t provide a protection mechanism against this, you are likely doomed.
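
One such protection mechanism, sketched here with boto3 and an assumed bucket name, is object versioning: a malicious “update” then creates a new version rather than destroying the only copy, and earlier versions remain recoverable. Object lock or MFA delete would harden this further.

    import boto3

    s3 = boto3.client("s3")

    # Keep prior versions of every object in the bucket.
    s3.put_bucket_versioning(
        Bucket="example-s3-pool",         # assumed bucket
        VersioningConfiguration={"Status": "Enabled"},
    )

    # After an overwrite, older versions are still listed and restorable.
    versions = s3.list_object_versions(Bucket="example-s3-pool", Prefix="critical/")
    for v in versions.get("Versions", []):
        print(v["Key"], v["VersionId"], v["IsLatest"])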

We are so happy with the flexibility of all the storage services available to us that we give scant consideration to what happens to, for example, instance storage when we delete the instance. Does it get erased? Or is it simply reissued to the next tenant? And if erasure is used on an SSD, how do we get past the internal block-reassignment mechanism that just moves deleted blocks to the free pool? A tenant using the right software tool can read those blocks. Your CSP may have an elegant solution, but good governance requires you to ask and to judge whether the answer is adequate.

Governance is a still-evolving facet of the cloud. There are solutions for the data you store, complete with automated analysis and event reporting, but the rise of SaaS and all the associated flavors of as-a-service raises the question of where your data actually is and whether it meets your high standards.

The ultimate challenge for cloud storage security is the human factor. Evil admins exist, or are created, within organizations, and a robust, secure system needs to accept that fact and protect against it with access controls, multi-factor authentication, and a process that identifies any place where a single disgruntled employee could destroy valued data. Be paranoid; it’s a case of when, not if.
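
As one hedged example of such a control, the sketch below creates an IAM policy that denies destructive S3 actions unless the caller authenticated with multi-factor authentication. The policy name and scope are illustrative; a real deployment would scope resources and actions much more carefully.

    import json
    import boto3

    policy_document = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Deny",
                "Action": ["s3:DeleteObject", "s3:DeleteBucket"],
                "Resource": "*",
                "Condition": {
                    # Deny when MFA was not used (or not reported at all).
                    "BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}
                },
            }
        ],
    }

    boto3.client("iam").create_policy(
        PolicyName="deny-delete-without-mfa",      # assumed policy name
        PolicyDocument=json.dumps(policy_document),
    )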

Let’s dig deeper into the security challenges of cloud storage and ways you can protect data stored in the cloud.

(Image: Kjpargeter/Shutterstock)


