
Hyperconvergence Paves Way To Private Cloud


With its promises of greater data center agility and efficiency, hyperconverged infrastructure has been one of the hottest IT trends in the last couple of years. At Interop ITX, analysts from Enterprise Strategy Group provided insight into the technology and advice for enterprises before they take the plunge.

Hyperconverged systems – as well as their predecessor, converged infrastructure – are something of a flashback to the all-in-one mainframe systems IT shops used to buy from companies like IBM, ESG Analyst Dan Conde said. Mainframes gave way to the best-of-breed, do-it-yourself approach to IT infrastructure.

“I liked that world where we could pick and choose and create the system we wanted,” Conde said. This freedom to choose IT components came with a price, though, which has led to interest in converged systems and hyperconvergence.

“We’ve suffered trying to mix and match things and make them work,” Conde said. “It was a pain because of the collective failure of the IT vendor community and standards bodies to make things work in a completely transparent way.”

Converged infrastructure products like FlexPod and Vblock combine compute, storage, and networking in one system and bolt them together with management software, ESG Analyst Jack Poller said. “When you have a problem, you have one vendor you can complain to,” he said.

Scaling this type of environment requires adding another full rack. “You still have to do all the work to pull it together,” he told attendees.

Hyperconverged infrastructure takes the integrated concept a step further by providing a Lego-like approach to storage, networking, and compute resources and using software to cluster them together into a single pool, Poller said. This enables scaling in smaller increments. “What’s important about hyperconvergence is the software stack,” he added.

Right now, hyperconverged systems, including those from Nutanix and SimpliVity (acquired earlier this year by HPE), are fundamentally about aggregating storage resources, Conde noted. Enabling software-defined networking may require additional specialized software.

For enterprise IT pros who are under pressure to improve agility but face regulatory compliance requirements that keep hardware on premises, hyperconvergence is a way to build a private cloud, Conde said.

“Don’t be afraid. If you want to do something with cloud speed on premises, use hyperconverged infrastructure,” he said.

An ESG survey of 308 respondents found that 70% plan to use hyperconverged infrastructure and 15% use it now. Fifty-six percent said they plan to use converged infrastructure; 32% have deployed it.

The top drivers for converged and hyperconverged infrastructure adoption include improved service and support, improved scalability, and increased agility of virtual machine provisioning, according to ESG research. “It’s a way to make our private cloud as competitive as the public cloud,” Conde said.

The survey also revealed enterprise concerns with the technologies: performance issues around data locality, lack of flexibility when scaling, and vendor lock-in.

ESG also found that many enterprises – 57% of the 308 polled – plan to stick with their traditional best-of-breed approach to IT infrastructure.

While many organizations have deployed or plan to deploy converged/hyperconverged infrastructure, they don’t expect the new systems to completely replace their traditional on-premises infrastructure provisioning, according to ESG.


Samba Security Hole Patched but Risk is Bigger » Linux Magazine


The world had barely recovered from the havoc caused by the WannaCry ransomware when a new vulnerability was found in the open source Samba networking utility.

According to Samba.org, “All versions of Samba from 3.5.0 onwards are vulnerable to a remote code execution vulnerability, allowing a malicious client to upload a shared library to a writable share, and then cause the server to load and execute it.”

In pure open source tradition, the patch was released immediately, and most Linux distributions have pushed it into their repositories.
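For systems that cannot take the patched packages right away, the Samba advisory also documents a configuration workaround: disabling the named-pipe endpoint the exploit relies on. The change goes in the [global] section of smb.conf (note that it can break some expected functionality for Windows clients, so it is a stopgap, not a fix):

    [global]
        nt pipe support = no

Restart smbd after editing the file for the setting to take effect.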

The real-world situation is grimmer than it appears. First, this is not a new bug: it has been lurking for the last seven years, since version 3.5.0 was released in 2010. That exposes a serious problem in the Linux world: there aren’t enough eyeballs to make all bugs shallow.

The second problem is that Samba, the open source re-implementation of Microsoft’s SMB protocol (the same protocol the WannaCry ransomware exploited), is used in virtually every product that offers any kind of file-sharing capability.

If you have a NAS device, a media streaming box, or any other device that offers file storage and sharing, it is more than likely running a Samba server. Although these devices run Linux-based distributions, they are not designed for automatic updates and don’t offer users an easy interface for updating packages.

At the same time, vendors in most cases have no incentive to keep these devices patched, which leaves them vulnerable. If you are running one of these devices, there is little you can do to fix the bug yourself beyond disconnecting the device from the network. The best course of action is to keep an eye on the product’s support site and, if updates become available, install them immediately.
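If you want to gauge your exposure, a reasonable first step is to scan your own network for devices listening on the SMB port. A minimal sketch using nmap, with the address range as a placeholder for your own LAN:

    $ nmap -p 445 --open 192.168.1.0/24

Any embedded device that turns up in the results is a candidate for the check-the-vendor-support-site routine described above.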




Raspberry Pi Foundation Merges with CoderDojo Foundation » Linux Magazine


Two open source organizations, the Raspberry Pi Foundation and the CoderDojo Foundation, are joining forces. Whereas the Raspberry Pi Foundation is known for its innovative credit-card-sized single-board computers, CoderDojo focuses on exposing young people to computer programming.

According to Philip Colligan, CEO of Raspberry Pi Foundation, “Bringing together Raspberry Pi, Code Club, and CoderDojo will create the largest global effort to get young people involved in computing and digital making.”

It’s not a simple merger. The Raspberry Pi Foundation will become a corporate member of the CoderDojo Foundation, and Colligan will join the CoderDojo board as a director. In return, CoderDojo co-founders Bill Liao and James Whelton will become members of the Raspberry Pi Foundation.

Both organizations will also continue to operate independently. Giustina Mizzoni, Executive Director of CoderDojo, said that CoderDojo will remain an independent charity based in Ireland.

“In practical terms, this merger will see our two organisations working closely together to advance our shared goals,” said Mizzoni. “It will enable us to leverage assets and capabilities ultimately driving further value for the CoderDojo Community.”

The Raspberry Pi Foundation will provide practical, financial, and back office support to the CoderDojo Foundation.

“With this extra support we will be able to reach and benefit even more young people globally by investing more time in resource development, community support and growth strategies to make it easier for our volunteers to start and keep running a Dojo in their community,” said Mizzoni.

The merger doesn’t imply that Raspberry Pi will become the exclusive platform for CoderDojo. As always, CoderDojo will remain software and hardware neutral.




Containers: A Primer for Enterprise IT Pros


The buzz around containers, particularly the Docker container platform, is hard to avoid. Containerization of applications promises speed and agility, capabilities that are essential in today’s fast-paced IT environment. But outside the world of DevOps, containers can still be an unfamiliar technology.

At Interop ITX, Stephen Foskett, organizer of Tech Field Day and proprietor of Gestalt IT, provided some clarity about application containerization. In a presentation titled “The Case For Containers,” he explained the basics of the technology and what enterprise IT shops can expect from it.

First off, container technology isn’t anything new, he said. “The reason we’re hearing about it is Docker. They’ve done a nice job of productizing it.”

He explained that containers are similar to virtual machines “except for this whole idea of user space.” A container, which uses operating system-level virtualization, has strict boundaries around a limited set of libraries and is custom-designed to run a specific application. That focus on one application is a key differentiator from virtual machines and makes containers important for enterprise IT, he said.

Docker, which launched as an open source project in 2013, “got a lot of things right,” Foskett said. For example, Docker Hub makes it easy to locate images, which become containers when users instantiate them. Docker also uses layered storage, which conserves space. At the same time, though, that layered storage can lead to performance issues, he added.
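Both points are easy to see from the command line. A quick sketch, assuming Docker is installed locally (the image is just a familiar example from Docker Hub):

    $ docker pull nginx        # fetch an image from Docker Hub
    $ docker history nginx     # list the stacked layers the image is built from

Each line that docker history prints is a layer, and layers shared between images are stored only once.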

Cattle or pets?

Since cloud technologies began altering the IT landscape, cattle vs. pets has become a common meme. “Many in DevOps will tell you they’re [containers] a cattle approach, but they’re not really cattle; they’re pets,” Foskett said.

While containers can be spun up and torn down quickly, by default Docker doesn’t actually destroy a stopped container, which can lead to container sprawl. “When you exit a container, the container stays there with the data as you left it,” unless it is manually deleted with the docker rm command, Foskett said.

“If you run a container and stop it, and the image stays around, someone can easily restart the container and access what you were doing,” he said. “That’s probably not a problem on your test laptop, but you can’t do that if you’re engineering a system.”
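The lifecycle is easy to demonstrate, and the --rm flag is the usual way to avoid leaving remnants behind; a short sketch:

    $ docker run -it ubuntu bash         # start a container, then type 'exit'
    $ docker ps -a                       # the exited container is still listed, data intact
    $ docker rm <container-id>           # remove it manually
    $ docker run --rm -it ubuntu bash    # or have Docker delete it on exit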

Another sticky issue for enterprises: It can be difficult to know the origin of images in the Docker Hub. “You can’t guarantee it’s something good,” Foskett said. “Many enterprises aren’t too thrilled with this concept.”

He advised practicing good hygiene when using containers by keeping images simple and using external volume storage to reduce the risk of data exposure. “Then the container itself stays pristine; you don’t have data building up in it.”
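In practice, that means mounting a named volume or host directory into the container instead of letting data accumulate in the container’s own filesystem. A minimal sketch, where the image name myapp and the mount path are hypothetical:

    $ docker volume create app-data
    $ docker run -v app-data:/var/lib/app myapp    # data lands in the volume, not the container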

Container benefits

One of the main reasons he’s excited about containers as a system administrator, Foskett said, is that they allow users to specify the entire application environment. A consistent application environment means not having to worry about OS levels, patches, or incompatible applications and utilities.

“This is the critical reason containers are going to be relevant in the enterprise data center,” he said.
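The place where that environment gets specified is the Dockerfile. A minimal sketch for a hypothetical Python application, with the interpreter version pinned so every container runs an identical stack:

    FROM python:3.6                        # pin the base image and interpreter version
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install -r requirements.txt    # pinned dependencies travel with the image
    COPY . .
    CMD ["python", "app.py"]

Anyone who builds this image gets the same OS layer, the same Python, and the same libraries, which is exactly the consistency Foskett described.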

Another container benefit is security, Foskett said. Security breaches often stem from escalation of privileges through utilities and application components, which can compromise an entire system. Containerized applications don’t include unused utilities, so there’s less exposure to infection.

Foskett said containers also enable scalable application platforms using microservices. Instead of monolithic systems that are hard to scale, enterprises can have containerized applications for specific functions.
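Composition tools make the pattern concrete: each function runs as its own small service, and the pieces scale independently. A sketch of a hypothetical two-service docker-compose.yml, with invented image names:

    version: "3"
    services:
      web:
        image: example/web-frontend:1.0
        ports:
          - "8080:80"
        depends_on:
          - api
      api:
        image: example/orders-api:1.0

Scaling the API tier then becomes a matter of running more api containers, without touching the web tier.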

Getting started

Foskett advised attendees to start experimenting with Docker and Windows containers. “One of the coolest things about Docker is that it’s really easy to try,” he said.
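The canonical first experiment is a one-liner that pulls a tiny test image and prints a confirmation message:

    $ docker run hello-world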

A Docker Enterprise Edition is in the works and will include certified containers and plugins. When you download a certified container from Docker Hub, “you know it’s really going to be what it says it is,” he said.

Docker Inc., the company that manages the Docker open source project and the ecosystem around it, has traditionally focused on developers, but has shifted to an enterprise mindset, Foskett said. “They’re addressing concerns we have.”

While real microservices won’t happen for another five to ten years, “the future really is containerized,” Foskett told attendees. “This isn’t just a fad or a trend, but an important movement in IT that has important benefits to people like you and me.”





7 Ways to Secure Cloud Storage


Figuring out a good path to security in your cloud configurations can be quite a challenge. This is complicated by the different types of cloud we deploy – public or hybrid – and the class of data and computing we assign to those cloud segments. Generally, one can create a comprehensive and compliant cloud security solution, but the devil is in the details and a nuanced approach to different use cases is almost always required.

Let’s first dispel a few myths. The cloud is a very safe place for data, despite FUD from those who might want you to stay in-house. The large cloud service providers (CSPs) run a tight ship, simply because they’d lose customers otherwise. Even so, we can assume their millions of tenants include some that are malevolent, whether hackers, government spies, or commercial thieves.

At the same time, don’t make the common assumption that CSP-encrypted storage is safe. If the CSP relies on drive-based encryption, don’t count on it: security researchers in 2015 uncovered flaws in a particular hard drive product line that rendered the automatic encryption useless. That is lazy man’s encryption. Do it right and encrypt on the server with your own keys.
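What doing it right can look like in practice: encrypt on the server before the data leaves for the cloud, holding the key yourself. A minimal Python sketch using the cryptography library and boto3, where the bucket and file names are placeholders and a real deployment would fetch the key from an HSM or key-management service rather than generate it inline:

    import boto3
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()    # in production, pull this from your own key store
    cipher = Fernet(key)

    with open("payroll.csv", "rb") as fh:
        ciphertext = cipher.encrypt(fh.read())    # encrypted before it leaves the server

    s3 = boto3.client("s3")
    s3.put_object(Bucket="example-bucket", Key="payroll.csv.enc", Body=ciphertext)

The CSP stores only ciphertext; without the key, a compromised drive or a nosy tenant gets nothing usable.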

Part of the data security story is that data must maintain its integrity under attack. Replication by itself isn’t sufficient; just think what would happen if all three replicas of a set of files in your S3 pool are “updated” by malware. If you don’t provide a protection mechanism against this, you are likely doomed.
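Object versioning is one such protection mechanism: if malware “updates” a file, the prior version survives and can be restored. A minimal boto3 sketch, with the bucket name again a placeholder:

    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_versioning(
        Bucket="example-bucket",
        VersioningConfiguration={"Status": "Enabled"},
    )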

We are so happy with the flexibility of all the storage services available to us that we give scant consideration to what happens to, for example, instance storage when we delete the instance. Does it get erased, or is it just re-issued? And if erasure is used on an SSD, how do we get around the internal block-reassignment mechanism that simply moves deleted blocks to the free pool? A tenant using the right software tool can read those blocks. Your CSP may have an elegant solution, but good governance requires you to ask and to understand whether the answer is adequate.

Governance is a still-evolving facet of the cloud. There are solutions for the data you store, complete with automated analysis and event reporting, but the rise of SaaS and all the associated flavors of as-a-Service leaves open the questions of where your data is and whether it complies with your standards.

The ultimate challenge for cloud storage security is the human factor. Rogue admins exist, or are created, within organizations; a robust and secure system needs to accept that fact and protect against it with access controls, multi-factor authentication, and a process that identifies any place where a single disgruntled employee could destroy valued data. Be paranoid; it’s a case of when, not if.
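As one illustration of that kind of control, an AWS IAM policy can deny destructive operations to any session that did not authenticate with MFA, so a single stolen password is not enough to destroy data. The bucket name below is a placeholder:

    {
      "Version": "2012-10-17",
      "Statement": [{
        "Sid": "DenyDeleteWithoutMFA",
        "Effect": "Deny",
        "Action": ["s3:DeleteObject", "s3:DeleteObjectVersion"],
        "Resource": "arn:aws:s3:::example-bucket/*",
        "Condition": {"BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}}
      }]
    }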

Let’s dig deeper into the security challenges of cloud storage and ways you can protect data stored in the cloud.
