Monthly Archives: October 2017

Linux Foundation Certified System Administrator and Engineer: Lars Kronfält


The Linux Foundation offers many resources for developers, users, and administrators of Linux systems. One of the most important offerings is its Linux Certification Program. The program is designed to give you a way to differentiate yourself in a job market that’s hungry for your skills.

In this series of articles, we consider how well certification prepares you for the real world. To illustrate that, the Linux Foundation is highlighting the experiences of some people who’ve recently passed the certification examinations. These testimonials should help you decide whether the Linux Foundation Certified System Administrator or the Linux Foundation Certified Engineer certification is right for you. In this article, we talk with recently certified Lars Kronfält.

Linux.com: How did you become interested in Linux and open source?

Lars Kronfält: My first encounter with Linux was back in the late 1990s. I had an Amiga growing up, exchanging floppy disks to share things. Running services on Linux and connecting computers in a network made a deep impression. Realizing that it was free to use and community-driven got me even more interested. The openness and accessibility of information backed by great minds collaborating really had me hooked.

Linux.com: What Linux Foundation course did you achieve certification in? Why did you select that particular course?

Kronfält: Both LFCS and LFCE. I wanted to show that I still know the craftsmanship of managing Linux even though I use tools to automate everything; I only write things once and often work in front of a whiteboard. Certification from the Linux Foundation is a stamp of quality, and I’m proud to say that I have those certificates; I often ask certification-like questions when we are hiring.

Linux.com: What are your career goals? How do you see Linux Foundation certification helping you achieve those goals and benefiting your career?

Kronfält: I work in a DevOps context with half a foot in Infrastructure Architecture. My career is probably heading towards Enterprise Architecture. I think that the LF certifications, in combination with my ITIL and CITA certifications, show a broad body of knowledge that will be beneficial for my career.

Linux.com: What other hobbies or projects are you involved in? Do you participate in any open source projects at this time?

Kronfält: Outside of work and family I try to find time for endurance sports, art, culture, and programming. I have hobby projects where I do as much programming as possible. My day job is for a B2B company with client-specific closed-source software. I am neither privately nor professionally engaged in an open source project at this time, but I contribute when it fits.

Linux.com: Do you plan to take future Linux Foundation courses? If so, which ones?

Kronfält: Yes, I really enjoy them and think that they are great both for the direct learning and as sources of reference. For example, MOOCs like Introduction to DevOps: Transforming and Improving Operations are a great resource for improving one’s way of working. Before trying to move code to a cloud, courses like Introduction to Kubernetes provide great guidelines.

Linux.com: In what ways do you think the certification will help you in today’s job market?

Kronfält: From my point of view, certifications are beneficial and show domain knowledge and an understanding of best practices and design patterns. That knowledge is helpful in finding common ground with peers and stakeholders.

Linux.com: What Linux distribution do you prefer and why?

Kronfält: My personal preference is for something Debian-based, but at work the main distribution has been CentOS for a while. For me, it doesn’t matter much. My basic needs for a workstation are a terminal, an editor, and a browser; the rest is just icing on the cake, especially when automation increases the level of abstraction. As for the boot process, let’s just say that I’m old school. Our main road ahead looks cloudy, filled with containers, and our base image right now is Ubuntu-based. At the same time, I’m playing around with minimized immutable servers. Well, I do dabble.

Linux.com: Are you currently working as a Linux systems administrator? If so, what role does Linux play?

Kronfält: Linux is a huge part of the foundation of everything I do. I work at a software vendor. Basically everything we build is on Linux. Almost everything we use runs on Linux.

Linux.com: Where do you see the Linux job market growing the most in the coming years?

Kronfält: Hard to pinpoint where it will grow the most. It will grow in all markets. The future for Linux professionals looks bright.

Linux.com: What advice would you give those considering certification for their preparation?

Kronfält: Run a virtual environment and test out the Domains and Competencies for each certification. Search the web for examples but try to complete them without using the web. Reset and restart, do it again. Read man pages. Ask a friend. Find a mentor. Practice makes perfect.

Linux.com: If you have found employment in the IT industry, do you feel like your certification was beneficial?

Kronfält: It has been beneficial. Even though I’ve been in the business for a long time and have proven my skills in many ways, the LF certifications show my interest in and level of seriousness about Linux. Personally, I really like LF, and being certified makes me happier. Passion and grit go a long way.

Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.

Read more:

Linux Foundation Certified System Administrator: Gabriel Rojo Argote

Linux Foundation LFCE Georgi Yadkov Shares His Certification Journey

Linux Foundation LFCS and LFCE: Pratik Tolia

Linux Foundation Certified Engineer: Gbenga “Christopher” Adigun

Linux Foundation Certified Engineer: Karthikeyan Ramaswamy

Linux Foundation Certified System Administrator: Muneeb Kalathil

Linux Foundation Certified System Administrator: Theary Sorn

Linux Foundation Certified Engineer: Ronni Jensen

Enterprises Gearing Up for IoT


For a few years now, experts have been advising enterprises to prepare their IT infrastructure for an onslaught of connected devices, and it appears companies are paying heed. According to a study by 451 Research, enterprises are planning to boost storage capacity, networking, and other infrastructure to accommodate the increased data produced by their internet of things projects.

The study, which polled 575 IT and IoT decision makers worldwide, found that organizations are making changes to their IT resources to support their IoT projects. Specifically, they’re planning to increase storage capacity (32%), network edge equipment (30%), server infrastructure (29%), and off-premises cloud infrastructure (27%) over the next year.

451 Research analysts said “the collection, storage, transport and analysis of IoT data is impacting all aspects of IT infrastructure.”

More than half of the IT pros surveyed reported that their companies initially store and analyze IoT data at a company-owned data center. “IoT data remains stored there for two-thirds of organizations, while nearly one-third of the respondents move the data to a public cloud,” analysts said.

About 65% of those surveyed, based mostly in North America and Europe, said they’re planning to increase their spending on IoT projects in the next 12 months. Less than 3% plan to reduce their IoT spending, according to 451 Research’s Voice of the Enterprise: Internet of Things – Workloads and Key Projects.

 

451 Research supplemented its web-based survey with 11 in-depth phone interviews with IoT IT managers and C-level executives.

Top IoT use cases include data center management along with surveillance and security monitoring, but in two years facilities automation is likely to be the main use case, analysts said.

Many of the companies surveyed said they process IoT data at the edge, either on the IoT device or in nearby IT infrastructure.

“Companies are processing IoT workloads at the edge today to improve security, process real-time operational action triggers, and reduce IoT data storage and transport requirements,” Rich Karpinski, research director at 451 Research, said in a prepared statement. While some say that they plan to conduct deeper data analytics at the network edge as well, most of the heavy data processing is happening in company-owned data centers or public cloud, he added.

Nearly half of those polled reported having a hard time finding workers with IoT skills. Data analytics, security, and virtualization are the skills most in demand, according to the report.

A separate study released late last month by satellite communications company Inmarsat also found an IoT skills shortage. According to the global survey of 500 senior IT decision-makers conducted by Vanson Bourne, 60% of those polled said they need more cybersecurity staff to handle the deluge of IoT data and 46% said they lacked staff with experience in analytics and data science. Nearly half said they lacked technical support skills required for successful IoT deployments.

 




Choosing a Cloud Provider: 8 Storage Considerations


Amazon Web Services, Google, and Azure dominate the cloud service provider space, but for some applications it may make sense to choose a smaller provider specializing in your app class and able to deliver a finer-tuned solution. No matter which cloud provider you choose, it pays to look closely at the wide variety of cloud storage services they offer to make sure they will meet your company’s requirements.

The big cloud providers offer two major classes of storage: local instance storage available with selected instance types, and a selection of network storage options for persistent storage and for sharing data between instances.

As with any storage, performance is a factor in your decision-making process. There are many shared network storage alternatives, with tiers ranging from really hot to freezing cold; within the top tiers there are differences depending on the choice of replica count, and prices vary for copying data to other spaces.

The very hot tier is moving to SSD, and even here there are differences between NVMe and SATA SSDs, which cloud tenants typically see as different IOPS levels. For large instances and GPU-based instances, the faster choice is probably better, though this depends on your use case.

At the other extreme, cold and “freezing” storage, the choices are disk or tape, which affects data retrieval times. With tape, retrieval can take as much as two hours, compared with just seconds for disk.

Data security and vendor reliability are two other key considerations when choosing a cloud provider that will store your enterprise data.  Continue on to get tips for your selection process.





5 Disaster Recovery Tips: Learning from Hurricanes


Hurricanes Irma and Harvey highlight the need for DR planning to ensure business continuity.

 

This has been an awful year for natural disasters, and yet we’re not even midway through a hurricane season that’s been particularly devastating. Hurricanes Irma and Harvey, and the flooding that ensued, have resulted in loss of life, extensive property damage, and crippled infrastructure.

Naturally, businesses have also been impacted. When it comes to applications, data, and data centers, this is a wake-up call. At the same time, these are the situations that motivate companies and individuals to introduce much-needed change. With this in mind, I’ll offer five tips any IT organization can use to become more resilient against natural disasters, no matter the characteristics of their systems and data centers. This can lead to better availability of critical data and tools when disaster strikes, continuity in serving customers, and peace of mind knowing that preparations have been made and work can continue as expected.

1. Keep your people safe

When a natural disaster is anticipated (if there is notice), IT staffers need to focus on personal and family safety issues. Having to work late to take one more backup off-site shouldn’t be part of the last-minute process. Simply put, no data is worth putting lives at risk. If the rest of these tips are followed, IT staff won’t have to scramble in the heavy push of preparation to tie up loose ends of what already should be a resilient IT strategy.

2. Follow the 3-2-1 rule

In my role, I’ve long advocated the 3-2-1 rule, and we need to keep reiterating it: Have three different copies of important data saved, on two different media, one of these being off-site. Embrace this rule if you haven’t already. There are two additional key benefits of the 3-2-1 rule: It doesn’t require any specific technology and can address nearly any failure scenario.
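
As a minimal sketch of what the rule can look like in practice (the paths and the off-site host below are placeholders, not a prescribed setup), the two extra copies might be made with nothing more exotic than rsync:

# Copy 1 is the live data itself, e.g. /srv/data
# Copy 2: back up to a second disk -- a different medium than the primary
rsync -a /srv/data/ /mnt/backup-disk/data/
# Copy 3: push an off-site copy over SSH
rsync -a /srv/data/ backup@offsite.example.com:/backups/data/

Whether the off-site target is a remote office, a colocation facility, or a cloud bucket is exactly the kind of choice the rule deliberately leaves open.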

3. 10 miles may not be enough

My third tip pertains to the off-site recommendation above. Many organizations believe the off-site copy or disaster recovery facility should be at least 10 miles away. This may no longer be sufficient; the path and fallout of a hurricane can be wide-reaching. Moreover, you want to avoid having personnel spend unnecessary time in a car traveling to complete the IT work. Cloud technologies can provide a more efficient and safer solution. This can involve using disaster recovery as a service (DRaaS) from a service provider or simply putting backups in the cloud.

4. Test your DR plan

Ensure that when a disaster plan is created, there is a particular focus on anticipating and eliminating surprises. This should involve regular testing of backups to be certain they are completely recoverable, that the plan will function as expected, and that all data is where it needs to be (off-site, for example). The last thing you want during a disaster is to find that the plan hasn’t been completely implemented or run in months, or worse, to discover there are workloads that are not recoverable.

5. Communications planning

My final recommendation is to work backwards through all required systems and with providers of all types to ensure you don’t have risks you can’t fix. Pay close attention to geography in relation to your own facilities, as well as to country locations for data sovereignty considerations. This can apply to telecommunications providers, too. A critical component of any disaster response is that organizations are able to communicate. Given what has happened in some locations in the path of Hurricane Irma, even cellular communication can be unreliable. Consider developing a plan to ensure communications in the interim if key business systems are down.

The recent flood and hurricane damage has been significant. The truth is, when it comes to the data, IT services, and more, there is a significant risk a business may never recover if it’s not adequately prepared. We live in a digitally transformed world and many businesses can’t operate without the availability of systems and data. These simple tips can bring about the resiliency companies need to effectively handle disasters, and prove their reliability to the customers they serve.

Rick Vanover is director of technical product marketing for Veeam Software.




Void Linux: A Salute to Old-School Linux


I’ve been using Linux for a very long time. Most days I’m incredibly pleased with where Linux is now, but every so often I wish I could step into a time machine and remind myself where the open source platform came from. Of late, I’ve experimented with a few such distributions, but none has come as close to what Linux once was as Void Linux.

Void Linux (created in 2008) is a rolling-release, general-purpose Linux distribution available for Intel, ARM, and MIPS architectures. Void offers a few perks that will appeal to Linux purists:

  • Void isn’t a fork of another distribution.

  • Void uses runit as the init system.

  • Void replaced OpenSSL with LibreSSL (due to the Heartbleed fiasco).

  • Void uses its own built-from-scratch package manager (called xbps).

Most of all, Void makes you feel like you’re using Linux of old (especially if you opt for the Xfce take on the desktop). With Void, you can opt to download a release with one of the following desktops:

  • Xfce

  • Cinnamon

  • Enlightenment

  • Lxde

  • Lxqt

You can also download a GUI-less version and install your desktop of choice.

With the exception of Cinnamon, the options are all focused on creating a very lightweight desktop. To that end, Void Linux will run very well on your hardware. I should mention here that working with Void Linux in VirtualBox is an exercise in frustration. I use VirtualBox for all my testing purposes, and Void does not play well with the VirtualBox Guest Additions. Because of this, Void runs terribly slowly in VirtualBox (even after following the official Void Linux installation instructions). With that warning in mind, if you want to test Void Linux, install it on a desktop machine and save yourself an hour or two of hair pulling.

That old-school installation

Regardless of which Void Linux desktop you opt to install, you’re going to get a taste of what it was like to install Linux “back in the day.” No, it’s not a perfect recreation, but it’s close enough. So download Void Linux, with your desktop of choice, and get ready.

When you boot the live ISO image, you will find yourself on whatever desktop you’ve chosen. One thing you won’t find is a tried-and-true Install icon on the desktop for simplified installation. Oh no. The installation of Void is handled through the terminal window, thanks to a lovely ncurses-based system.

Upon boot, you must open up a terminal window, su to the root user (the default root user password is voidlinux), and then issue the command void-installer. This will fire up the ncurses-based installer, where you must walk through the various installation steps (Figure 1).
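
For clarity, that boils down to just two commands in the live session’s terminal:

su              # the default root password is voidlinux
void-installer  # launches the ncurses-based installer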

You can use the arrow keys on your keyboard to move up and down and hit Enter to select a menu entry to configure. However, if you just hit Enter on the first entry and then configure that option, you will automatically be moved down to the next step. Most of these steps are very intuitive. It’s not until you get to the Partitioning and Filesystems sections that you might find cause to raise an eyebrow. Of course, any user who remembers the process of installing Linux in the early days shouldn’t have a problem with these steps. But if you’re used to, say, the Ubuntu installer (which makes installing the platform as simple as installing an application), you might have trouble.

When you reach the partition section of the installation (Figure 2), you’ll want to tab down to New, hit Enter, and then define the size for the partition. Mark the partition bootable, tab to Write, and hit Enter (on your keyboard).

Once the partition is written, tab to Quit and hit Enter. In the filesystem section (Figure 3), you must first select a filesystem type and then specify the mount point.

The mount point for your filesystem will most likely be /. Enter that in the section to specify the mount point for /dev/sda1 (Figure 4), tab down to OK, and then hit Enter.

Once you have your filesystem and mount point taken care of, you can then move down to Install and run the installer. This section will take about two minutes. When the installation completes, you can then reboot and enjoy your newly installed Void Linux distribution.

Post installation

With Void Linux installed, you’ll find a fairly minimal set of tools available. Out of the box, there is no office suite, no email client, no image editor, not even a graphical package manager. What you have is a barebones desktop with a nice command-line installation tool that allows you to install exactly what you want.

What many Linux faithful will appreciate most about Void Linux is that it opts for runit over systemd. The runit system is incredibly fast and easily configured. For example, where systemd requires comparatively complex unit files, runit can start a process with what amounts to a single line of code (see the sketch below). That not only makes runit very easy to configure, but goes a long way toward speeding up the boot process. For more information on runit, check out the official page.
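
To illustrate how small a runit service definition can be, here is a minimal sketch (mydaemon is a hypothetical program, not a real package):

#!/bin/sh
# /etc/sv/mydaemon/run -- runit runs and supervises whatever this script execs
exec mydaemon --foreground

Saving that as /etc/sv/mydaemon/run, marking it executable, and symlinking the directory into the active service tree (ln -s /etc/sv/mydaemon /var/service/) is, on Void, all it takes for runit to start and supervise the process.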

If you don’t happen to like the desktops offered by Void, you can install, say, GNOME using the xbps-install command like so:

xbps-install -S gnome

It just so happens that the version of GNOME available in the Void Linux repositories is 3.26, so you’re getting the latest, greatest GNOME desktop. There are thousands of other applications you can install on Void. You can query the package manager like so:

xbps-query -Rs PACKAGENAME

Where PACKAGENAME is the name of the software you want to find.
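
As a quick illustration (firefox here is just an example package name), a typical search-then-install session looks like this:

xbps-query -Rs firefox     # search the remote repositories
xbps-install -S firefox    # sync the repository index and install the package

And because Void is a rolling release, keeping the whole system current is a single command: xbps-install -Su.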

Who should enter the Void?

I can’t say I’d recommend Void Linux to just anyone. In fact, I think it’s safe to say that new-to-Linux users need not apply. Out of the box, Void doesn’t really offer enough in the way of user-facing applications to appease the new crowd. And because there isn’t a GUI package manager, new users would find themselves frustrated very quickly.

However, if you’re wise to the ways of Linux (especially the command line), Void is a refreshing change from the same ol’ same ol’. Void offers just the right amount of old-school Linux to make you feel like you’ve traveled back in time, while still maintaining enough modernity to remain current.

Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.