
Linux Foundation LFCS: Ahmed Alkabary

The Linux Foundation offers many resources for Linux and open source developers, users, and administrators. One of the most important offerings is its Linux Certification Program, which is designed to give you a way to differentiate yourself in a job market that’s hungry for your skills.

How well does the certification prepare you for the real world? To illustrate that, the Linux Foundation will be featuring some of those who have recently passed the certification examinations. These testimonials should help you decide if either the Linux Foundation Certified System Administrator or the Linux Foundation Certified Engineer certification is right for you. In this installment of our series, we talk with Ahmed Alkabary.

An introduction

Alkabary writes, “I want to share my experience with the LFCS, as I do believe it’s a unique one. It all started with winning the Academic Aces award in the 2016 LIFT program and then receiving a free LFCS exam coupon…”

How did you become interested in Linux and open source?

Ahmed Alkabary: I always knew about Linux as an alternative to Windows, but never really got to experience it until 2011. I decided to buy a new laptop, and the laptop that stood out for me had Linux pre-installed on it. I remember well that the pre-installed distribution was openSUSE. I was hesitant to buy it, as I had no experience with Linux whatsoever, but I thought to myself, well, I can just install Windows on it if I don’t like it. Once I booted the system and saw how fast and neat everything was, I thought it was a message from the Linux gods.

It’s really weird, because on my first day I felt that Linux was meant for me, not just as an operating system to use; I felt my life would be centered around Linux from that day on.

I was a first-year computer science student at the time, so I quickly developed a passion for operating systems. I immediately started experimenting with Linux by installing different distros and trying to understand the filesystem, as well as everything behind Linux itself. I had been treating Windows as an operating system I used just to check email and do Google searches, but Linux made me think about operating systems on a whole different level. It’s like driving a Ferrari and suddenly getting super excited about cars and how they work! Linux being free was also a huge factor in my becoming interested in it.

What Linux Foundation course did you achieve certification in? Why did you select that particular course?

Alkabary: I earned my LFCS certification, and I chose it because it’s a very important and prestigious certification to achieve if one is seriously considering a career in Linux. It also covers all the fundamentals. More importantly, the LFCS exam is very hands-on, which makes it leagues better than other Linux certs that are multiple-choice based. Earning the LFCS certification makes me feel that I am up to any task I take on at my job. Since this exam is hands-on, it’s not like crammable multiple-choice exams that really don’t verify any skill besides memorization.

Employers can rest assured that anyone who passes this exam has a solid understanding of Linux and can be a trustworthy Linux sysadmin. One other advantage is that the exam is online. Instead of traveling to a testing center, you can take the test from the comfort of your own room, on your favorite chair, on your favorite computer.

What are your career goals? How do you see The Linux Foundation certification helping you achieve those goals and benefiting your career?

Alkabary: I am currently working as a junior Linux administrator at ISM Canada (an IBM company). My career goal is to become a senior Linux administrator/kernel developer. My ultimate goal is to become someone who advocates for Linux and a pioneer of this awesome piece of software. The Linux certification makes me feel more confident in my skills and makes me feel like I am able to reach all the goals I’ve set for myself. I will prepare for the LFCE exam, which will make me even more comfortable with Linux and will go a long way toward ensuring more success at my current job (as every question I had in the LFCS exam was basically a task I have to do in my position). Some questions even made me realize I was doing certain things incorrectly at work.

What other hobbies or projects are you involved in? Do you participate in any open source projects at this time?

Alkabary: I am very interested in the Linux kernel. I am currently learning about it and want to get into Linux driver development and cgroups. It is a very steep learning curve and quite complicated compared to Linux administration, and there aren’t many helpful resources. Within this realm, The Linux Foundation made it easier by offering a course on Linux kernel internals and development. I recently read an article about how Linux kernel skills are very scarce and in huge demand at the moment. I believe The Linux Foundation should create a Linux kernel development certification, which would be a serious breakthrough, because many more people would get interested in developing for the Linux kernel. A certification program would make it much easier for kernel enthusiasts to contribute to the kernel project.

Do you plan to take future Linux Foundation courses? If so, which ones?

Alkabary: Yes, I am planning to take the LFCE, and I am also very happy about the partnership with Microsoft, as I can now take the Linux on Azure certification, which is a joint certification between The Linux Foundation and Microsoft. At work, we recently implemented an Azure stack, so taking the Linux on Azure certification will definitely help me quite a lot.

In what ways do you think the certification will help you as a systems administrator in today’s market?

Alkabary: It will help me verify my Linux skills and will make me more confident and excited about my career goals. All the exam objectives are basically part of the everyday tasks that get assigned to me at work, so passing the exam makes me feel I am better at my job. Also, the LFCS is one of two steps toward the Microsoft MCSA (Linux on Azure), and we implement Azure solutions here at ISM Canada, so getting an MCSA will definitely be a huge asset. It will help me contribute with greater impact and become a leader within my organization within a few months of employment.

Are you currently working as a Linux systems administrator? If so, what role does Linux play?

Alkabary: Yes, I am currently working as a Linux sysadmin in a mid-range environment. My job is pretty much centered around Linux: I build and patch Linux servers, perform maintenance-related Linux tasks, and administer a wide array of Unix/Linux servers.

What Linux distribution do you prefer and why?

Alkabary: I would have to say openSUSE Tumbleweed, as it is kind of my first Linux love. It is very beautifully designed, and I also like working with YaST, as well as the zypper package manager.

But I also like Fedora, as it is closely tied to Red Hat, which most of my work revolves around. So I would say openSUSE is my favorite hobby distro and Fedora is my favorite professional distro.

Where do you see the Linux job market growing the most in the coming years?

Alkabary: I see it growing the most in the cloud; Linux is already the most used OS on Azure, Amazon, and OpenStack. I would say more growth will occur within the mobile world as well (we all know that Android is based on Linux). I can see Linux continuing its dominance in the cloud and mobile in the coming years for sure. And that’s not neglecting the fact that Linux is growing in popularity for personal use every day, so it’s becoming more popular as a desktop as well.

What advice would you give those considering certification for their preparation?

Alkabary: There are free Linux Foundation courses on edX, so those would be a great starting point. The LFS201 course is a great preparation course as well. I also used Sander van Vugt’s LFCS series, which is also really good.

I highly recommend that everyone take the LFCS and LFCE exams. They will open doors and verify your Linux skills, and, last but not least (it’s probably redundant to say by now), Linux skills are in great demand. So a job is almost guaranteed with an LFCS or LFCE certification.

Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.

Read more:

Linux Foundation LFCS and LFCE: Alberto Bullo

Linux Foundation LFCS and LFCE: Miltos Tsatsakis

Linux Foundation Certified System Administrator: Gabriel Rojo Argote

Linux Foundation LFCE Georgi Yadkov Shares His Certification Journey

Linux Foundation LFCS and LFCE: Pratik Tolia

Linux Foundation Certified Engineer: Gbenga “Christopher” Adigun

Linux Foundation Certified Engineer: Karthikeyan Ramaswamy

Linux Foundation Certified System Administrator: Muneeb Kalathil

How to Encrypt Files From Within a File Manager

The Linux desktop and server enjoy a remarkable level of security. That doesn’t mean, however, that you should simply rest easy. You should always assume your data is just a quick hack away from being compromised. With that in mind, you might want to employ various tools for encryption, such as GnuPG, which lets you encrypt and decrypt files and much more. One problem with GnuPG is that some users don’t want to mess with the command line. If that’s the case, you can turn to a desktop file manager. Many Linux desktops include the ability to easily encrypt or decrypt files, and if that capability is not built in, it’s easy to add.

I will walk you through the process of encrypting and decrypting a file from within three popular Linux file managers: GNOME’s Nautilus, KDE’s Dolphin, and Xfce’s Thunar.

Installing GnuPG

Before we get into the how-to of this, we have to ensure your system includes the necessary base component: GnuPG. Most distributions ship with GnuPG included. On the off chance you use a distribution that doesn’t ship with GnuPG, here’s how to install it:

  • Ubuntu-based distribution: sudo apt install gnupg

  • Fedora-based distribution: sudo dnf install gnupg

  • openSUSE: sudo zypper in gnupg

  • Arch-based distribution: sudo pacman -S gnupg
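
Before reaching for the package manager, it’s worth checking whether GnuPG is already present. A minimal sketch of that check in any POSIX shell:

```shell
# Check for an existing GnuPG installation; print its version if found.
if command -v gpg >/dev/null 2>&1; then
    gpg --version | head -n 1
else
    echo "gpg not found -- install it with your distribution's package manager"
fi
```

Either branch tells you where you stand before continuing with the steps below.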

Whether you’ve just now installed GnuPG or it was installed by default, you will have to create a GPG key for this to work. Each desktop uses a different GUI tool for this (or may not even include a GUI tool for the task), so let’s create that key from the command line. Open up your terminal window and issue the following command:

gpg --gen-key

You will then be asked to answer a few questions about the key, such as the key type, key size, and expiration date. Unless you have good reason, you can accept the defaults.

Once you’ve answered these questions, type y to indicate the answers are correct. Next you’ll need to supply the following information:

  • Real name.

  • Email address.

  • Comment.

Complete the above and then, when prompted, type O (for Okay). You will then be required to type a passphrase for the new key. Once the system has collected enough entropy (you’ll need to do some work on the desktop so this can happen), your key will have been created and you’re ready to go.
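
If you’d rather script this step, GnuPG also supports unattended key generation from a parameter file (see the “Unattended key generation” section of the GnuPG manual). A sketch follows; the name and email are placeholders, and %no-protection creates a key without a passphrase, so treat this as suitable for throwaway or test keys only:

```shell
# Use an isolated keyring so this demo doesn't touch ~/.gnupg
export GNUPGHOME="$(mktemp -d)"

# Parameter file for unattended generation; identity values are placeholders.
cat > keyparams <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 3072
Subkey-Type: RSA
Subkey-Length: 3072
Name-Real: Demo User
Name-Email: demo@example.com
Expire-Date: 0
%commit
EOF

gpg --batch --gen-key keyparams
gpg --list-keys demo@example.com   # confirm the key was created
```

For your real key, stick with the interactive gpg --gen-key flow above so the key is protected by a passphrase.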

Let’s see how to encrypt/decrypt files from within the file managers.


Nautilus

We start with Nautilus, the default GNOME file manager, because it is the easiest: it requires no extra installation or extra work to encrypt/decrypt files from within its well-designed interface. Once you have your GPG key created, open the file manager, navigate to the directory housing the file to be encrypted, right-click the file in question, and select Encrypt from the menu (Figure 1).

You will be asked to select a recipient (or list of recipients — Figure 2). NOTE: Recipients will be those users whose public keys you have imported. Select the necessary keys and then select your key (email address) from the Sign message as drop-down.

Notice you can also opt to encrypt the file with only a passphrase. This is important if the file will remain on your local machine (more on this later). Once you’ve set up the encryption, click OK and (when prompted) type the passphrase for your key. The file will be encrypted (now ending in .gpg) and saved in the working directory. You can now send that encrypted file to the recipients you selected during the encryption process.
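
The passphrase-only option corresponds to gpg’s symmetric mode. Here is the command-line equivalent of that round trip; the file name and passphrase are illustrative, and the --batch/--pinentry-mode loopback flags are there only so the example runs without an interactive prompt (interactively, plain gpg --symmetric will ask for the passphrase):

```shell
# Encrypt a file with a passphrase only (symmetric mode), then decrypt it.
echo "sensitive notes" > notes.txt
gpg --batch --yes --pinentry-mode loopback --passphrase 'demo-pass' \
    --symmetric notes.txt                      # writes notes.txt.gpg
gpg --batch --yes --pinentry-mode loopback --passphrase 'demo-pass' \
    --output notes.out --decrypt notes.txt.gpg
cat notes.out                                  # -> sensitive notes
```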

Say someone (who has your public key) has sent you an encrypted file. Save that file, open the file manager, navigate to the directory housing that file, right-click the encrypted file, select Open With Decrypt File, give the file a new name (without the .gpg extension), and click Save. When prompted, type your gpg key passphrase and the file will be decrypted and ready to use.


Dolphin

On the KDE front, a package must be installed in order to encrypt/decrypt from within the Dolphin file manager. Log in to your KDE desktop, open the terminal window, and issue the following command (I’m demonstrating with Neon; if your distribution isn’t Ubuntu-based, you’ll have to alter the command accordingly):

sudo apt install kgpg

Once that installs, log out and log back in to the KDE desktop. You can then open Dolphin and right-click a file to be encrypted. Since this is the first time you’ve used kgpg, you’ll have to walk through a quick setup wizard (which is self-explanatory). When you’ve completed the wizard, you can go back to that file, right-click it (Figure 3), and select Encrypt File.

You’ll be prompted to select the key to use for encryption (Figure 4). Make your selection and click OK. The file will be encrypted, and you’re ready to send it to the recipient.

Note: With KDE’s Dolphin file manager, you cannot encrypt with a passphrase only.

If you receive an encrypted file from a user who has your public key (or you have a file you’ve encrypted yourself), open Dolphin, navigate to the file in question, double-click the file, give the file a new name, type the encryption passphrase, and click OK. You can now read your newly decrypted file. If you’ve encrypted the file with your own key, you won’t be prompted for the passphrase (as gpg-agent has cached it).


Thunar

The Thunar file manager is a bit trickier. There aren’t any extra packages to install; instead, you need to create a new custom action for Encrypt. Once you’ve done this, you’ll be able to encrypt files from within the file manager.

To create the custom actions, open up the Thunar file manager and click Edit > Configure Custom Actions. In the resulting window, click the + button (Figure 5) and enter the following for an Encrypt action:

Name: Encrypt

Description: File Encryption

Command: gnome-terminal -x gpg --encrypt %f

(gpg will prompt you for the recipient in the terminal window it opens)

Click OK to save this action.

NOTE: If gnome-terminal isn’t your default terminal, substitute the command that opens your default terminal.

You can also create an action that encrypts with a passphrase only (not a key). To do this, the details for the action would be:

Name: Encrypt Passphrase

Description: Encrypt with Passphrase only

Command: gnome-terminal -x gpg -c %f

You don’t need to create a custom action for the decryption process, as Thunar already knows what to do with an encrypted file. To decrypt a file, simply right-click it in Thunar, select Open With Decrypt File, give the decrypted file a name, and (when/if prompted) type the encryption passphrase. Voilà, your encrypted file has been decrypted and is ready to use.
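
You can sanity-check the Encrypt action’s core gpg command outside Thunar by substituting a real file for %f. In this sketch, a throwaway key (generated unattended, as the GnuPG manual describes) and the file name stand in for your own:

```shell
# Throwaway keyring and key so the example is self-contained.
export GNUPGHOME="$(mktemp -d)"
cat > keyparams <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Subkey-Type: RSA
Subkey-Length: 2048
Name-Real: Thunar Demo
Name-Email: thunar@example.com
Expire-Date: 0
%commit
EOF
gpg --batch --gen-key keyparams

# The Encrypt action's command, with %f replaced by a real file:
echo "draft" > draft.txt
gpg --batch --yes --encrypt --recipient thunar@example.com draft.txt
ls draft.txt.gpg   # the encrypted copy the custom action would produce
```

If this works in a terminal, the custom action will behave the same way when invoked from Thunar’s context menu.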

One caveat

Do note: If you encrypt your own files using your own keys, you won’t need to enter a passphrase to decrypt them (because gpg-agent caches it). If, however, you receive files from others (who have your public key), you will be required to enter your passphrase. If you want to store your own encrypted files, instead of encrypting them with a key, encrypt them with a passphrase only. This is possible with Nautilus and Thunar (but not Dolphin). By opting for passphrase encryption over key encryption, you will always be prompted for the passphrase when you go to decrypt the file.

Other file managers

There are plenty of other file managers out there, some of which can work with encryption and some of which cannot. Chances are, you’re using one of these three tools, so the ability to add encryption/decryption to the contextual menu is not only possible, it’s pretty easy. Give this a try and see if it doesn’t make the process of encryption and decryption much easier.

Data Protection in the Public Cloud: 6 Steps

While cloud security remains a top concern in the enterprise, public clouds are likely to be more secure than your private computing setup. This might seem counterintuitive, but cloud service providers have a scale advantage that allows them to spend much more on security tools than any large enterprise, while the cost of that security is spread across millions of users, down to fractions of a cent each.

That doesn’t mean enterprises can hand over all responsibility for data security to their cloud provider. There are still many basic security steps companies need to take, starting with authentication. While this applies to all users, it’s particularly critical for sysadmins: a password compromise on their mobile devices could be the equivalent of handing over the corporate master keys. For admins, multi-factor authentication is critical for secure operations. Adding biometrics via smartphones is the latest wave in the second or third factor of that authentication; there are a lot of creative strategies!

Beyond guarding access to cloud data, what about securing the data itself? We’ve heard of major data exposures occurring when a set of instances is deleted but the corresponding data isn’t. After a while, these files get loose and can make for some interesting reading. This is pure carelessness on the part of the data owner.

There are two answers to this issue. For larger cloud setups, I recommend a cloud data manager that tracks all data and spots orphan files. That should stop the wandering buckets, but what about the case when a hacker gets in, by whatever means, and can reach useful, current data? The answer, simply, is good encryption.

Encryption is a bit more involved than running PKZIP on a directory. AES-256 encryption or better is essential, and key management is crucial: having one admin hold the only key is a disaster waiting to happen, while writing it down on a sticky note goes to the opposite extreme. One option offered by cloud providers is drive-based encryption, but this fails on two counts. First, drive-based encryption usually has only a few keys to select from and, guess what, hackers can readily access a list on the internet. Second, the data has to be decrypted by the network storage device to which the drive is attached, then re-encrypted (or not) as it’s sent to the requesting server. There are lots of security holes in that process.
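
As a concrete sketch of the AES-256 point: gpg can enforce that cipher for a symmetric archive before the data leaves your infrastructure. The paths and passphrase below are illustrative; in practice the passphrase should come from a secrets manager, not the command line:

```shell
# Bundle data and encrypt it with AES-256 before uploading anywhere.
mkdir -p demo-data && echo "customer records" > demo-data/records.csv
tar czf backup.tar.gz demo-data
gpg --batch --yes --pinentry-mode loopback --passphrase 'demo-only' \
    --symmetric --cipher-algo AES256 backup.tar.gz   # -> backup.tar.gz.gpg

# Round-trip check before trusting the encrypted copy:
gpg --batch --yes --pinentry-mode loopback --passphrase 'demo-only' \
    --output restored.tar.gz --decrypt backup.tar.gz.gpg
cmp backup.tar.gz restored.tar.gz && echo "round trip OK"
```

Only backup.tar.gz.gpg should ever be uploaded; the plaintext archive stays on the server that holds the key.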

End-to-end encryption is far better, where encryption is done with a key kept in the server. This stops downstream security vulnerabilities from being an issue while also adding protection from packet sniffing.

Data sprawl is easy to create with clouds, but it opens up another security risk, especially if a great deal of cloud management is decentralized to departmental computing or even individual users. Cloud data management tools address this much better than written policies. It’s also worth considering adding global deduplication to the storage management mix, which reduces the exposure footprint considerably.

Finally, the whole question of how to back up data is in flux today. Traditional backup and disaster recovery have moved from in-house tape and disk methods to the cloud as the preferred storage medium. The question now is whether a formal backup process is the proper strategy, as opposed to snapshot or continuous backup systems. The snapshot approach is growing, due to the value of small recovery windows and limited data loss exposure, but there may be risks from not having separate backup copies, perhaps stored in different clouds.

On the next pages, I take a closer look at ways companies can protect their data when using the public cloud.

(Image: phloxii/Shutterstock)


Site Reliability Engineering: 4 Things to Know

In 2016, Google published a book called “Site Reliability Engineering: How Google Runs Production Systems” that extolled a new approach to managing IT infrastructure. In Google’s words, site reliability engineering, or SRE for short, is “what you get when you treat operations as if it’s a software problem.”

That definition seems to align very closely with the DevOps movement, which aims, in part, to bring agile software development approaches to infrastructure management. People involved in DevOps teams have become increasingly interested in SRE and how it might help them become more collaborative and agile.

To find out more about site reliability engineering, Network Computing spoke with Rob Hirschfeld, who has been in the cloud and infrastructure space for nearly 15 years, including work with early ESX betas and serving on the OpenStack Foundation Board. Hirschfeld, cofounder and CEO of RackN, will present “DevOps vs SRE vs Cloud Native” at Interop ITX 2018.

We asked him to explain some of the basics of SRE, and what infrastructure pros need to know about this new concept. He highlighted four key facts:

1. SRE is a job function that started at Google

“Site reliability engineering is a term that was coined by Google to describe their engineering operations group,” Hirschfeld said. “It’s basically a job function that spans multiple disciplines on the operations side of Google. They are responsible not only for data center operations, but going up all the way to interacting with application developers and some of their key internet properties to analyze them, do performance management — basically take a sustained application into an ongoing full lifecycle deployment.”

2. SRE complements DevOps approaches and cloud-native architecture

Hirschfeld explained that DevOps, SRE, and cloud-native apply similar philosophies to different aspects of IT. DevOps is “about people and culture and process,” SRE is “a job function,” and cloud-native is “an architectural pattern that describes how applications are built in a way that makes them more sustainable and runnable in the cloud,” he said.

He added, “It fits very cleanly together where we have an architectural pattern, a job function, a process management description — all three tie together to really create the way modern application development works.”

3. SRE offers greater reliability and performance

In Hirschfeld’s words, SRE “supercharges a company’s operational experiences.”

He said that by embracing SRE, companies are “placing a high priority on sustaining engineering and making sure their site is up and running and performing well, and that they are not so focused on adding a feature that might hurt the customer experience in the end by being unreliable or slow.”

He also noted that while many organizations have very high regard for their developers, that hasn’t always been true for IT operations personnel. SRE can equalize the influence and respect afforded to development and operations staff.

4. SRE requires commitment

The one big downside of SRE is that it “takes a bit of commitment,” Hirschfeld said. “If the company is used to letting the operations team fight fires all the time and move from crisis to crisis, the SRE team is going to slow down those processes while it cleans house, while it fixes the backlog of problems and builds a more repeatable process.” That process can be discouraging, but he encourages organizations not to give up.

He also noted, “If you just throw SRE onto a team that’s not empowered as an SRE team, you will not be that successful at all. It’s not something you should do halfway.”

In conclusion, he re-emphasized the connections among DevOps, SRE, and cloud-native. “You can’t succeed at SRE without thinking about DevOps, without thinking about cloud-native architecture as well,” he said. “They all go hand-in-hand.”

Get live advice on networking, storage, and data center technologies to build the foundation to support software-driven IT and the cloud. Attend the Infrastructure Track at Interop ITX, April 30-May 4, 2018. Register now!



Top Trends Impacting Enterprise Infrastructure

Enterprise infrastructure teams are under massive pressure as the cloud continues to upend traditional IT architectures and ways of providing service to the business. Companies are on a quest to reap the speed and agility benefits of cloud and automation, and infrastructure pros must keep up.

In this rapidly changing IT environment, new technologies are challenging the status quo. Traditional gear such as dedicated servers, storage arrays, and network hardware still has its place, but companies are increasingly looking to the cloud, automation, and software-defined technologies to pursue their digital initiatives.

According to IDC, by 2020, the heavy workload demands of next-generation applications and IT architectures will have forced 55% of enterprises to modernize their data center assets by updating their existing facilities or deploying new facilities.

Moreover, by the end of next year, the need for better agility and manageability will lead companies focused on digital transformation to migrate more than 50% of their IT infrastructure in their data center and edge locations to a software-defined model, IDC predicts. This shift will speed adoption of advanced architectures such as containers, analysts said.

Keith Townsend, founder of The CTO Advisor and Interop ITX Infrastructure Track Chair, keeps a close eye on the evolution of IT infrastructure. On the next pages, read his advice on what he sees as the top technologies and trends for infrastructure pros today: hyperconvergence, network disaggregation, cloud migration strategies, and new abstraction layers such as containers.

(Image: Timofeev Vladimir/Shutterstock)


