Monthly Archives: September 2018

Shared Storage with NFS and SSHFS | Linux.com


Up to this point, my series on HPC fundamentals has covered PDSH, to run commands in parallel across the nodes of a cluster, and Lmod, to allow users to manage their environment so they can specify various versions of compilers, libraries, and tools for building and executing applications. One missing piece is how to share files across the nodes of a cluster.

File sharing is one of the cornerstones of client-server computing, HPC, and many other architectures. You can perhaps get away without it, but life just won’t be easy any more. This situation is true for clusters of two nodes or clusters of thousands of nodes. A shared filesystem allows all of the nodes to “see” the exact same data as all other nodes. For example, if a file is updated on cluster node03, the updates show up on all of the other cluster nodes, as well.

Fundamentally, being able to share the same data with a number of clients is very appealing because it saves space (capacity), ensures that every client has the latest data, improves data management, and, overall, makes your work a lot easier. The price, however, is that you now have to administer and manage a central file server, as well as the client tools that allow the data to be accessed.

Although you can find many shared filesystem solutions, I like to keep things simple until something more complex is needed. A great way to set up file sharing uses one of two solutions: the Network File System (NFS) or SSH File System (SSHFS).
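As a minimal sketch of the two options (the server name, subnet, and paths here are placeholders, not taken from the article), an NFS export with its matching client mount, plus the SSHFS equivalent, might look like:

```
# Server side, /etc/exports -- export /shared read-write to the cluster subnet:
/shared  192.168.1.0/24(rw,sync,no_subtree_check)

# Client side, /etc/fstab -- mount that export at boot:
headnode:/shared  /shared  nfs  defaults  0  0

# SSHFS alternative -- nothing beyond sshd is required on the server:
#   sshfs user@headnode:/shared /shared
```

NFS suits a trusted cluster network; SSHFS trades some throughput for the simplicity of riding on an existing SSH setup.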

Read more at ADMIN Magazine

Deepin Linux: As Gorgeous As It Is User-Friendly | Linux.com


Deepin Linux. You may not have heard much about this distribution, and the fact that it’s often left out of the conversation is a shame. Why? Because Deepin Linux is as beautiful as it is user-friendly. This distribution has plenty of “wow” factor and very little disappointment.

For the longest time, Deepin Linux was based on Ubuntu. But with the release of 15.7, that all changed. Now, Deepin’s foundation is Debian, but the desktop is still that beautiful Deepin Desktop. And when I say it’s beautiful, it truly is one of the most gorgeous desktop environments you’ll find on any operating system. That desktop uses a custom-built Qt5 toolkit, which runs as smoothly and with as much polish as any I’ve ever used. Along with that desktop come a few task-specific apps, built with the same toolkit, so the experience is consistent and integrated.

What makes the 15.7 release special is that it comes just two short months after the 15.6 release and is focused primarily on performance. Not only is the ISO download smaller, but many core components have been optimized with laptop battery life in mind. To that end, the developers have achieved up to 20 percent better battery life and much-improved memory usage. Other additions to Deepin Linux include:

  • NVIDIA Prime support (for laptops with hybrid graphics).

  • On-screen notifications (for the likes of turning on or off the microphone and/or Wi-Fi).

  • New drag and drop animation.

  • Added power saving mode and auto-mode switching for laptops.

  • Application categories in mini mode.

  • Full disk installation.

For a full list of improvements and additions, check out the 15.7 Release notes.

Let’s install Deepin Linux and see just what makes this distribution so special.

Installation

In similar fashion to the desktop, the Deepin Linux installer is one of the most beautiful OS installers you will find (Figure 1). Not only is the installer a work of art, it’s incredibly simple. As with most modern Linux distributions, installing Deepin is only a matter of answering a few questions and clicking Next a few times.

Installation shouldn’t take more than 10 minutes. In fact, based on the download experience I had with the main download mirror, the installation will likely go faster than the ISO download. To that end, you might want to pick an alternative mirror to snag a copy of Deepin Linux.

Once you’ve installed Deepin Linux, you can then log onto your new desktop.

First Steps

Upon first login, you’ll be greeted by a setup wizard that walks you through the configuration of the desktop (Figure 2).

In this wizard, you will be asked to configure a handful of options.

Once you’ve selected those options, you’ll find yourself on the Deepin Desktop (Figure 3).

Applications

The application list might surprise some users, especially those who have grown accustomed to certain applications being installed by default. What you’ll find on Deepin Linux is a list of applications that includes:

  • WPS Office

  • Google Chrome

  • Spotify

  • Deepin Store

  • Deepin Music

  • Deepin Movie

  • Steam

  • Deepin Screenshot

  • Foxit Reader

  • Thunderbird Mail

  • Deepin Screen Recorder

  • Deepin Voice Recorder

  • Deepin Cloud Print

  • Deepin Cloud Scan

  • Deepin Font Installer

  • ChmSee

  • Gparted

What the developers have done is ensure users have as complete a desktop experience as possible, out of the box. In other words, the average user wouldn’t have to bother installing any extra software for some time. And for those who question the choice of WPS Office, I’ve used it on plenty of occasions, and it is quite adept not only at creating stand-alone documents but also at collaborating with those who work in other office suites. The one caveat is that WPS Office isn’t open source. However, Deepin Linux doesn’t promote itself as a fully open desktop, so having closed-source applications (such as the Spotify client and WPS Office) should surprise no one.

Control Center

Deepin takes a slightly different approach to the Control Center. Instead of it being a stand-alone, windowed application, the Control Center serves as a sidebar (Figure 4), where you can configure users, display, default applications, personalization, network, sound, time/date, power, mouse, keyboard, updates, and more.

Click on any one of the Control Center categories and you can see how well the developers have thought out this new means of configuring the desktop (Figure 5).

Hot Corners

The Deepin Desktop also has a nifty hot corners feature on the desktop. With this feature, you can set each corner to a specific action, such that when you hover your mouse over a particular corner, the configured action will occur. Available actions are:

  • Launcher

  • Fast Screen Off

  • Control Center

  • All Windows

  • Desktop

  • None

To set the hot corners, right-click on the desktop and select Corner Settings from the pop-up menu. You can then hover your cursor over one of the four corners and select the action you want associated with that corner (Figure 6).

A Must-Try Distribution

If you’re looking for your next Linux desktop distribution, you’d be remiss if you didn’t give Deepin Linux 15.7 a try. Yes, it is beautiful, but it’s also very efficient, very user-friendly, and sits on top of a rock-solid Debian foundation. It’s a serious win-win for everyone. In fact, Deepin 15.7 is the first distribution to come along in a while to make me wonder if there might finally be a contender to drag me away from my long-time favorite distro: elementary OS.

Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.

The Future of Open Source | Software


By Jack M. Germain

Sep 19, 2018 5:00 AM PT

Linux and the open source business model are far different today than many of the early developers might have hoped. Neither can claim a rags-to-riches story. Rather, their growth cycles have been a series of hit-or-miss milestones.

The Linux desktop has yet to find a home on the majority of consumer and enterprise computers. However, Linux-powered technology has long ruled the Internet and conquered the cloud and Internet of Things deployments. Both Linux and free open source licensing have dominated in other ways.

Microsoft Windows 10 has experienced similar deployment struggles as proprietary developers have searched for better solutions to support consumers and enterprise users.

Meanwhile, Linux is the more rigorous operating system, but it has been beset by a growing list of open source code vulnerabilities and compatibility issues.

The Windows phone has come and gone. Apple’s iPhone has thrived in spite of stagnation and feature restrictions. Meanwhile, the Linux-based open source Android phone platform is a worldwide leader.

Innovation continues to drive demand for Chromebooks in homes, schools and offices. The Linux kernel-driven Chrome OS, with its browser-based environment, has made staggering inroads for simplicity of use and effective productivity.

Chromebooks now can run Android apps. Soon the ability to run Linux programs will further feed open source development and usability, both for personal and enterprise adoption.

One of the most successful aspects of non-proprietary software trends is the wildfire growth of container technology in the cloud, driven by Linux and open source. Those advancements have pushed Microsoft into bringing Linux elements into the Windows OS and containers into its Azure cloud environment.

“Open source is headed toward faster and faster rates of change, where the automated tests and tooling wrapped around the delivery pipeline are almost as important as the resulting shipped artifacts,” said Abraham Ingersoll, vice president of sales and solutions engineering at Gravitational.

“The highest velocity projects will naturally win market share, and those with the best feedback loops are steadily gaining speed on the laggards,” he told LinuxInsider.

Advancement in Progress

To succeed with the challenges of open source business models, enterprises have to devise a viable way to monetize community development of reusable code. Those who succeed also have to master the formula for growing a free computing platform or its must-have applications into a profitable venture.

Based on an interesting GitLab report, 2018 is the year for open source and DevOps, remarked Kyle Bittner, business development manager at Exit Technologies.

That forecast may be true eventually, as long as open source can dispel the security fears, he told LinuxInsider.

“With open source code fundamental to machine learning and artificial intelligence frameworks, there is a challenge ahead to convince the more traditional IT shops in automotive and oil and gas, for example, that this is not a problem,” Bittner pointed out.

The future of the open source model may be vested in the ability to curb worsening security flaws in bloated coding. That is a big “if,” given how security risks have grown as Linux-based deployments evolved from isolated systems to large multitenancy environments.

LinuxInsider asked several open source innovators to share their views on where the open source model is headed, and to recommend the best practices developers should use to leverage different OS deployment models.

Oracle’s OS Oracle

Innovative work and developer advances changed the confidence level for Oracle engineers working with hardware where containers are involved, according to Wim Coekaerts, senior vice president of operating systems and virtualization engineering at Oracle. Security of a container is critical to its reliability.

“Security should be part of how you do your application rollout and not something you consider afterward. You really need to integrate security as part of your design up front,” he told LinuxInsider.

Several procedures in packaging containers require security considerations. That security assessment starts when you package something. In building a container, you must consider the source of those files that you are packaging, Coekaerts said.

Security continues with how your image is created. For instance, do you have code scanners? Do you have best practices around the ports you are opening? When you download from third-party websites, are those images signed so you can be sure of what you are getting?

“It is common today with Docker Hub to have access to a million different images. All of this is cool. But when you download something, all that you have is a black box,” said Coekaerts. “If that image that you run contains ‘phone home’ type stuff, you just do not know unless you dig into it.”
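The “black box” concern above boils down to provenance: if you cannot verify what you downloaded, you cannot trust it. One minimal defense is comparing a downloaded artifact’s checksum against a digest published through a trusted channel. A sketch of that idea in Python (the content and digest here are simulated, not a real registry interaction):

```python
import hashlib

def verify_digest(blob: bytes, expected_sha256: str) -> bool:
    """Compare the SHA-256 of downloaded content against a published digest."""
    actual = hashlib.sha256(blob).hexdigest()
    return actual == expected_sha256

# Simulated download: in practice this would be an image layer or tarball.
layer = b"example image layer contents"
published = hashlib.sha256(layer).hexdigest()  # digest from a trusted channel

assert verify_digest(layer, published)                # untampered content passes
assert not verify_digest(b"tampered contents", published)  # altered content fails
```

A matching digest proves integrity, not intent; signed images (where the publisher signs the digest) add the missing link back to a known author.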

Yesterday Returns

Ensuring that containers are built securely is the inbound side of the technology equation. The outbound part involves running the application. The current model is to run containers in a cloud provider world inside a virtual machine to ensure that you are protected, noted Coekaerts.

“While that’s great, it is a major change in direction from when we started using containers. It was a vehicle for getting away from a VM,” he said. “Now the issue has shifted to concerns about not wanting the VM overhead. So what do we do today? We run everything inside a VM. That is an interesting turn of events.”

A related issue is running containers natively, where there is not enough isolation between processes. So now what?

The new response is to run containers in a VM to protect them. Security is not compromised, thanks to lots of patches in Linux and the hypervisor. That ensures all the issues with the cache and side channels are patched, Coekaerts said.

However, it leads to new concerns among Oracle’s developers about how they can ramp up performance and keep up that level of isolation, he added.

Are Containers the New Linux OS?

Some view today’s container technology as the first step in creating a subset of traditional Linux. Coekaerts gives that view some credence.

“Linux the kernel is Linux the kernel. What is an operating system today? If you look at a Linux distribution, that certainly is morphing a little bit,” he replied.

What is running an operating system today? Part of the model going forward, Coekaerts continued, is that instead of installing an OS and installing applications on top, you basically pull in a Docker-like structure.

“The nice thing with that model is you can run different versions on the same machine without having to worry about library conflicts and such,” he said.

Today’s container operations resemble the old mainframe model. On the mainframe, everything was a VM. Every application you started had its own VM.

“We are actually going backward in time, but with a much lighter weight model. It is a similar concept,” Coekaerts noted.

Container Tech Responds Rapidly

Container technology is evolving quickly.

“Security is a central focus. As issues surface, developers are dealing with them quickly,” Coekaerts said, and the security focus applies to other aspects of the Linux OS, too.

“All the Linux developers have been working on these issues,” he noted. “There has been a great communication channel before the disclosure date to make sure that everyone has had time to patch their version of the kernel, and to make sure that everyone shares code. Is the process perfect? No. But everyone works together.”

Security Black Eye

Vulnerabilities in open source code have been the cause of many recent major security breaches, said Dean Weber, CTO of Mocana.

Open source components are present in 96 percent of commercial applications, based on a report Black Duck released last year.

The average application has 147 different open source components, and 67 percent of applications use components with known vulnerabilities, according to the report.

“Using vulnerable, open source code in embedded OT (operational technology), IoT (Internet of Things) and ICS (industrial control system) environments is a bad idea for many reasons,” Weber told LinuxInsider.

He cited several examples:

  • The code is not reliable within those devices.
  • Code vulnerabilities easily can be exploited. In OT environments, you don’t always know where the code is in use or if it is up to date.
  • Systems cannot always be patched in the middle of production cycles.

“As the use of insecure open source code continues to grow in OT, IoT and ICS environments, we may see substations going down on the same day, major cities losing power, and sewers backing up into water systems, contaminating our drinking water,” Weber warned.

Good and Bad Coexist

The brutal truth for companies using open source libraries and frameworks is that open source is awesome, generally high-quality, and absolutely the best method for accelerating digital transformation, maintained Jeff Williams, CTO of Contrast Security.

However, open source comes with a big “but,” he added.

“You are trusting your entire business to code written by people you don’t know, for a purpose different from yours, and who may be hostile to you,” Williams told LinuxInsider.

Another downside to open source is that hackers have figured out that it is an easy attack vector. Dozens of new vulnerabilities in open source components are released every week, he noted.

Every business option comes with a bottom line. For open source, the user is responsible for the security of all the open source used.

“It is not a free lunch when you adopt it. You are also taking on the responsibility to think about security, keep it up to date, and establish other protections when necessary,” Williams said.

Best Practices

Developers need an efficient guideline to leverage different deployment models. Software complexity makes it almost impossible for organizations to deliver secure systems. So it is about covering the bases, according to Exit Technologies’ Bittner.

Fundamental practices, such as creating an inventory of open source components, can help devs match known vulnerabilities with installed software. That reduces the threat risk, he said.
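The inventory-matching practice above can be sketched in a few lines of Python (the component names, versions, and CVE IDs below are invented for illustration; a real tool would pull vulnerability data from a feed such as the NVD):

```python
# Hypothetical inventory of open source components: name -> installed version.
inventory = {
    "libexample": "1.2.3",
    "parserlib": "0.9.1",
    "nethelper": "2.0.0",
}

# Hypothetical vulnerability data: (name, version) -> known CVE IDs.
known_vulnerabilities = {
    ("parserlib", "0.9.1"): ["CVE-2018-0001"],
    ("nethelper", "1.9.9"): ["CVE-2017-0042"],
}

def flag_vulnerable(inv, vulns):
    """Return components whose exact (name, version) pair has a known CVE."""
    return {
        name: vulns[(name, version)]
        for name, version in inv.items()
        if (name, version) in vulns
    }

print(flag_vulnerable(inventory, known_vulnerabilities))
# → {'parserlib': ['CVE-2018-0001']}
```

Note that nethelper is not flagged: the known CVE applies to 1.9.9, not the installed 2.0.0, which is exactly the kind of version-sensitive matching an inventory enables.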

“Of course, there is a lot of pressure on dev teams to build more software more quickly, and that has led to increased automation and the rise of DevOps,” Bittner acknowledged. “Businesses have to ensure they don’t cut corners on testing.”

Developers should follow the Unix philosophy of minimalist, modular deployment models, suggested Gravitational’s Ingersoll. The Unix approach involves progressive layering of small tools to form end-to-end continuous integration pipelines. That produces code running in a real target environment without manual intervention.

Another solution for developers is an approach that can standardize on a common build for their specific use that considers third-party dependencies, security and licenses, suggested Bart Copeland, CEO of ActiveState. Also, best practices for OS deployment models need to consider dependency management and environment configuration.

“This will reduce problems when integrating code from different departments, decrease friction, increase speed, and reduce attack surface area. It will eliminate the painful retrofitting of open source languages for dependency management, security, licenses and more,” he told LinuxInsider.

Where Is the Open Source Model Headed?

Open source has become more and more enterprise-led. That shift has been accompanied by a rise in distributed applications composed of container-based services, managed by tools such as Kubernetes, according to Copeland.

Application security is at odds with the goals of development: speed, agility and leveraging open source. These two paths need to converge in order to facilitate development and enterprise innovation.

“Open source has won. It is the way everyone — including the U.S. government — now builds applications. Unfortunately, open source remains chronically underfunded,” said Copeland.

That will lead to open source becoming more and more enterprise-led. Enterprises will donate their employee time to creating and maintaining open source.

Open source will continue to dominate the cloud and most server estates, predicted Howard Green, vice president of marketing for Azul Systems. That influence starts with the Linux OS and extends through much of the data management, monitoring and development stack in enterprises of all sizes.

It is inevitable that open source will continue to grow, said Contrast Security’s Williams. It is inextricably bound with modern software.

“Every website, every API, every desktop application, every mobile app, and every other kind of software almost invariably includes a large amount of open source libraries and frameworks,” he observed. “It is simply unavoidable and would be fiscally imprudent to try to develop all that code yourself.”


Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open source technologies. He has written numerous reviews of Linux distros and other open source software.






Linus Torvalds Takes a Break, Apologizes » Linux Magazine


In an unexpected move, Linus Torvalds, the creator of the Linux kernel, is going to take a break from the kernel as he reflects on his behavior on the Linux Kernel Mailing List (LKML).

He made this announcement on LKML, “I am going to take time off and get some assistance on how to understand people’s emotions and respond appropriately.”

Torvalds admitted, “I need to change some of my behavior, and I want to apologize to the people that my personal behavior hurt and possibly drove away from kernel development entirely.”

Although Torvalds is generally very friendly towards users, he is known for using strong language and sometimes insulting comments when discussing technical issues with Linux kernel maintainers and developers.

It’s true that, unlike other managers, Torvalds doesn’t have the power to encourage or discourage his team members by demoting them or taking away their bonuses; his options are limited. However, his frustration with his team needs a different outlet, because personal attacks have proved to be demotivating. Many talented developers have quit kernel development.

The kernel community has been vocal about the problem and has admitted that there is no place for this behavior. It will be interesting to see a changed Torvalds when he returns from his break.

The announcement by Linus accompanied the release of a newly revamped Code of Conduct to support a positive work environment for all kernel participants.




NextCloud 14 Arrives » Linux Magazine


Nextcloud Gmbh has announced the release of Nextcloud 14, a fully open source enterprise file sync and storage (EFSS) solution. The new release brings many new features, including an even tighter focus on security.

Unlike its closest competitor, Dropbox, Nextcloud is more of a platform than just a sync and storage solution. Nextcloud comes with online collaborative software, secure web chat, secure voice and video conferencing, calendaring, contacts, and more.

Now Nextcloud is using a combination of its services to offer tighter security. It uses ‘video verification’ for sharing sensitive data. When sending a document, a user can choose to add a ‘Talk’ verification feature (Talk is the name of Nextcloud’s video chat service).

The recipient would have to appear online via video chat and confirm their identity in order for the file to be transferred. The sender would send a password for the file and the receiver would receive the password verbally through the video chat.

Another security-centric feature of Nextcloud 14 is new two-factor authentication support. The feature allows users to use third-party messaging apps, such as Signal and Telegram, or SMS, as a second factor to secure their authentication.

Hypothetically, Nextcloud could take this to the next level by introducing three-factor authentication, asking the recipient to verify a QR code sent via SMS during the video chat.

Nextcloud 14 is available for free download.


