Tag Archives: Storage

How Software-Defined Storage Can Empower Developers to Increase Business Value


Software developers are now among the most strategic assets of any organization. In today’s fast-paced world, the speed at which one can develop new applications and microservices can dictate whether a company gets to market first or can respond effectively to a sudden competitive move or market shift. In other words, developers are having an unprecedented and direct impact on companies’ – and industries’ – fortunes.

This reality is supported by a 2018 Stripe and Harris Poll study, which predicts software developers’ skillsets alone could add $3 trillion to global GDP over the next decade. Accordingly, 61 percent of C-suite respondents to that study believe access to developer talent is a threat to the success of their business.

Freeing developers to work faster and be more productive

Not surprisingly, organizations aren’t just trying to keep developers focused on what they do best: creating, solving problems, and innovating – they’re also trying to increase their productivity.

Yet, despite the evolving appreciation for developers’ talents, the same study found that many companies are misusing their most important resource. A significant proportion of developers’ time is spent maintaining aging, legacy systems and patching bad software – to the tune of approximately $300 billion per year, with nearly $85 billion being spent addressing bad code alone.

As such, the role of the application architect has emerged in this new world of hybrid platforms to ensure developers’ code runs smoothly, interacts with other services, and makes efficient use of data, regardless of where it is created or consumed.

Meanwhile, development teams are gaining more authority from their line of business managers who realize that their organizations need to harness the immense amount of data they’re collecting and use it for competitive advantage. They want to give developers the ability to provision, and deprovision, resources as they need them, and to develop applications faster than ever before. These managers are prepared to invest in tools that can enable their teams’ success.

The strategic role of storage in agile development

The reality is that developers don’t have time to wait for traditional IT anymore. They need tools and technologies that allow them to work at speed, in an agile manner – supporting, for example, rapid experimentation or the deployment of artificial intelligence (AI), machine learning (ML), and deep learning within their applications.

New methods of accelerating value through application development have emerged in the past few years. While pure public cloud strategies can be quick to deploy, they often cannot meet the performance or governance requirements of more specialized deployments. Hybrid cloud strategies that architect applications to make the best use of resources, from multicloud and on-premises to remote sites and even the device edge, enable organizations to act on data streams at every point in the workflow, greatly improving time to value.

Cloud-native application development has grown from largely stateless apps to more stateful applications within distributed systems, requiring the ability to rebalance data, auto-scale, and perform seamless upgrades — all of which can become infinitely easier with persistent, reliable storage.
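As a hedged illustration of what on-demand, persistent storage can look like from a developer's seat, the sketch below uses the Kubernetes Python client (Kubernetes being one such container-orchestration system) to request a PersistentVolumeClaim; the namespace, claim name, size, and storage class are assumptions, and a software-defined storage backend would be expected to fulfil the claim.

```python
from kubernetes import client, config

def request_volume(name: str = "orders-db-data", size: str = "10Gi") -> None:
    """Request durable storage for a stateful service by creating a PersistentVolumeClaim.

    The claim name, size, storage class, and namespace are illustrative assumptions;
    a software-defined storage backend (for example, a CSI driver) would satisfy the claim.
    """
    config.load_kube_config()  # use the developer's local kubeconfig
    pvc = client.V1PersistentVolumeClaim(
        metadata=client.V1ObjectMeta(name=name),
        spec=client.V1PersistentVolumeClaimSpec(
            access_modes=["ReadWriteOnce"],
            storage_class_name="standard",  # assumed storage class
            resources=client.V1ResourceRequirements(requests={"storage": size}),
        ),
    )
    client.CoreV1Api().create_namespaced_persistent_volume_claim(
        namespace="default", body=pvc
    )

if __name__ == "__main__":
    request_volume()
```

The point of the sketch is less the API call itself than the workflow it enables: the developer declares the storage the application needs and moves on, rather than filing a ticket and waiting for an array to be carved up.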

Exploiting data for competitive advantage

In addition to the flexibility it offers, software-defined storage can help organizations to better harness the value of data, including the continual stream of information and insights gleaned through their applications. Developers and data scientists need to be able to constantly extract, analyze, and react to data to maintain agility, and they can do that more easily with software-defined storage.

Whereas the siloed nature of traditional storage arrays and appliances can inhibit access to data, containerized, open source storage environments facilitate access, regardless of whether data is stored on-premises, at a remote site, at the edge, or in a public or multicloud.

Choosing an IT environment conducive to innovation

This raises a related but important point: many organizations believe the silver bullet to enterprise agility lies in the public cloud. In some cases, this is true, but the public cloud can pose a series of challenges itself. The sum of the “fixes” for these challenges can be costly.

It’s no coincidence that there has been an upsurge in open source container-orchestration systems for application deployment, scaling, and management. Embracing hybrid cloud architecture enables organizations to create flexible infrastructure that suits their diverse business and governance requirements – helping them control costs without sacrificing agility.

Developers must differentiate themselves to stay competitive

Today’s developers are being given unfettered access to the tools and technologies they need to drive innovation and are visibly pushing their organizations and industries forward.

Attracted by growing career opportunities in software and application development, newcomers are flocking into the field – further increasing the pressure on the developer community.

Survival in this highly competitive environment is no small feat. Learning how to differentiate oneself and drive industry disruption consistently takes a high level of skill and determination. Equally, a successful developer needs infrastructure, services, and storage-native solutions that can match the speed of development.




Storj Opens Its Decentralized Storage Service Project to Beta





Storj Labs has released the beta of its namesake open source decentralized cloud object storage software, and has opened beta access to its own implementation of that software, the decentralized cloud storage service Tardigrade. Originally the brainchild of founder Shawn Wilkinson, Storj has gone through two earlier implementations before arriving at version 3 (V3). (Source: The New Stack)



Haiku OS Picks Up An NVMe Storage Driver



Back in the BeOS days of the 1990s, NVM Express solid-state storage obviously wasn't a thing, but the open-source, BeOS-inspired Haiku OS now has an NVMe driver.

Haiku, which aims to be an open-source OS based on BeOS, now supports NVMe SSDs. The driver didn't make last September's Haiku R1 beta, but it can now be found in the latest development code.

As of the latest Haiku code, NVMe SSDs should be fully usable under the BeOS-inspired operating system. More details are available via Haiku.org.


Cloud Storage and Policies: How Can You Find Your Way?


Cloud storage is one of the hottest topics today, and rightfully so: new services seem to be added almost daily. Storage is among the most attractive categories of cloud services, so it is only natural for organizations to look for business problems it can solve.

The reality is that storage in the cloud is a whole new discipline. Completely different. Forget everything you know and let's start from the beginning. Both Amazon Web Services and Microsoft Azure offer many different storage services. Some resemble what we have used on-premises, such as Azure File Storage and AWS Elastic Block Store, which mirror traditional file shares and block storage; yet how they are used can make a very big difference in your experience in the cloud. There are more storage services in the cloud (such as object storage, gateways, and more) that differ from what has traditionally been used on-premises, and that is where it gets interesting.

Let’s first identify why organizations want to leverage the cloud for storage. This may seem a needless step, but it is more critical than ever. The why is very important. The fundamental reason why should be that the cloud is the right platform for the storage need. Supporting reasons will also include cloud benefits such as these:

No upfront purchase: This is different from the on-premises practice of purchasing storage for future capacity needs (best guesses, overspending, or badly missed targets are common with that practice!).

Effectively unlimited capacity: Ask any mathematician and they will quickly point out that the cloud is not truly unlimited, but from most customers' perspective the cloud provides effectively unlimited storage options.

Predictable pricing: While not exactly linear, it is pretty clear what consumption pricing will be with cloud storage.

These are some of the good reasons to embrace cloud storage, but beyond the reasons to move to the cloud, the strong advice is to examine storage policies and usage so there are no surprises later. Part of this means looking at the economics across the complete scope of use. Too often pricing is viewed simply as consumption per month. Take AWS S3, for example: S3 Standard storage prices the first 50 TB per month at $0.023 per GB (pricing as of March 2019, US East (Ohio) region). But other aspects of using the storage should absolutely be considered; take, for example, the following (a rough cost sketch follows this list):

Getting data into the cloud is often overlooked, but it carries a cost as well. This makes how data is written to the cloud important: is data sent in many small increments (more write operations or PUT requests) or in relatively fewer, larger increments? This can change the cost profile.

Egress occurs when data is read out of a cloud storage location, and that too has a cost. One practical cost control is to favor solutions that retrieve only the right pieces of data rather than entire datasets.

Deleting data: Interesting to think about, not for costs per se, but deletion should still be considered. Data in the cloud will live for as long as you pay for it, so give thought to ensuring no dead data is left living in the cloud.
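To put rough numbers on that complete scope of use, the sketch below estimates a monthly bill from the $0.023 per GB figure quoted above plus per-request and egress rates; the request and egress prices are illustrative assumptions rather than quoted pricing, and real pricing is tiered and region-specific.

```python
# Rough monthly cost sketch for S3 Standard within the first 50 TB tier.
# Only the $0.023/GB storage price comes from the text above; the PUT-request
# and egress rates are illustrative assumptions, not quoted pricing.

STORAGE_PER_GB = 0.023   # USD per GB-month (first 50 TB tier, per the article)
PUT_PER_1000 = 0.005     # USD per 1,000 PUT requests (assumed)
EGRESS_PER_GB = 0.09     # USD per GB transferred out (assumed)

def monthly_cost(stored_gb: float, put_requests: int, egress_gb: float) -> float:
    storage = stored_gb * STORAGE_PER_GB
    writes = (put_requests / 1000) * PUT_PER_1000
    egress = egress_gb * EGRESS_PER_GB
    return storage + writes + egress

# Example: 10 TB stored, uploaded as 1 MB objects (~10.5 million PUTs), 1 TB read back out.
print(f"${monthly_cost(10_240, 10_485_760, 1_024):,.2f} per month")
```

Uploading the same 10 TB as fewer, larger objects would shrink the PUT-request line dramatically, which is exactly the write-increment point above; likewise, reading back only the pieces you need keeps the egress line in check.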

But what can organizations do to manage cloud storage from a policy perspective? In a way, some of the same practices as before can be applied, but organizations should also leverage the frameworks the cloud platforms provide to help manage usage and consumption. AWS Organizations is a good example, providing policy-based management of multiple AWS accounts; it streamlines account management, billing, and control of cloud services. Similar capabilities exist in Azure with Subscription and Service Management along with Azure RBAC.
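As a minimal, hedged sketch of that policy-based approach, the following uses boto3 to create a service control policy in AWS Organizations and attach it to an organizational root; the policy content, names, and root ID are illustrative assumptions, not a recommended baseline.

```python
import json
import boto3

# Sketch: create a service control policy (SCP) and attach it to an organization root.
# The policy content, policy name, and target root ID below are illustrative assumptions.
org = boto3.client("organizations")

scp_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyS3BucketDeletion",
            "Effect": "Deny",
            "Action": ["s3:DeleteBucket"],   # example guardrail: block bucket removal
            "Resource": "*",
        }
    ],
}

policy = org.create_policy(
    Name="deny-s3-bucket-deletion",          # assumed policy name
    Description="Prevent accidental bucket removal across member accounts",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp_document),
)

org.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="r-examplerootid",              # assumed organization root ID
)
```

Whatever the specific guardrail, the value is that it is applied once at the organization level rather than negotiated account by account.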

By taking a responsible look at new cloud services in light of what we have learned in the past, coupled with the new frameworks available in the cloud, organizations can easily and confidently embrace cloud storage services, not only answering the right-platform question but also managing those services in a way that lets CIOs and decision makers sleep at night.




Assess USB Performance While Exploring Storage Caching | Linux.com


The team here at the Dragon Propulsion Laboratory has kept busy building multiple Linux clusters as of late [1]. Some of the designs rely on spinning disks or SSD drives, whereas others use low-cost USB storage or even SD cards as boot media. In the process, I was hastily reminded of the limits of external storage media: not all flash is created equal, and in some crucial ways external drives, SD cards, and USB keys can be fundamentally different.

Turtles All the Way Down

Mass storage performance lags that of working memory in the Von Neumann architecture [2], with the need to persist data leading to the rise of caches at multiple levels in the memory hierarchy. An access speed gap of three orders of magnitude between levels makes this design decision essentially inevitable where performance is at all a concern. (See Brendan Gregg’s table of computer speed in human time [3].) The operating system itself provides the most visible manifestation of this design in Linux: any RAM not allocated to a running program is used by the kernel to cache reads from and buffer writes to the storage subsystem [4], leading to the often-repeated quip that there is really no such thing as “free memory” in a Linux system.
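The read side of that caching is easy to observe. Here is a minimal sketch, assuming a reasonably large file already exists at a hypothetical path, that times the same sequential read twice: the second pass is typically served from the page cache rather than from the device.

```python
import time

# Time two consecutive reads of the same large file. The path is a placeholder;
# the first pass generally comes from the device, the second from the page cache.
PATH = "/var/tmp/big.bin"   # assumed pre-existing file of a few hundred megabytes

def timed_read(path: str) -> float:
    start = time.time()
    with open(path, "rb") as f:
        while f.read(1024 * 1024):   # read 1 MiB chunks, discarding the data
            pass
    return time.time() - start

print(f"first read : {timed_read(PATH):.2f}s (likely from the device)")
print(f"second read: {timed_read(PATH):.2f}s (likely from the page cache)")
```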

An easy way to observe the operating system (OS) buffering a write operation is to write the right amount of data to a disk in a system with lots of RAM, as shown in Figure 1: a rather improbable half a gigabyte's worth of zeros appears to be written to a generic, low-cost USB key in half a second, but a roughly 30-second delay follows when the system is forced to sync [5] the data to disk.
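A minimal sketch of the same experiment, assuming a USB key mounted at a hypothetical /mnt/usbkey path: the buffered write returns almost immediately because the kernel caches it in RAM, while fsync blocks until the cached pages actually reach the slow device.

```python
import os
import time

# Write 512 MiB of zeros to a file on the USB key, then force it out of the page cache.
# The mount point is a placeholder; the timing gap illustrates write buffering.
PATH = "/mnt/usbkey/zeros.bin"
CHUNK = b"\0" * (1024 * 1024)          # 1 MiB of zeros

start = time.time()
with open(PATH, "wb") as f:
    for _ in range(512):               # 512 MiB in total
        f.write(CHUNK)
    buffered = time.time() - start
    f.flush()
    os.fsync(f.fileno())               # block until the device has the data
synced = time.time() - start

print(f"buffered write finished in {buffered:.1f}s")
print(f"write plus fsync finished in {synced:.1f}s")
```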

Read more at ADMIN magazine