Monthly Archives: November 2018

Ruby in Containers | Linux.com


There was a time when deploying software was an event, even a ceremony, because of how hard it was to keep environments consistent. Teams spent a lot of time making the destination environment run the software the same way the source environment did, and then prayed that the gods would keep it running as well in production as it had in development.

With containers, deployments are more frequent because we package our applications with their libraries as a single, portable unit, which helps us maintain consistency and reliability when moving software between environments. For developers, this means improved productivity, portability and easier scaling.

Because of this portability, containers have become the universal language of the cloud, allowing us to move software from one cloud to another without much trouble.

In this article, I will discuss two major concepts to keep in mind while working with containers in Ruby: how to create small container images and how to test them.
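To make that concrete, here is a minimal, hypothetical sketch (not taken from the article): build a small Ruby image on an Alpine base and run the test suite in a throwaway container. The base image tag, the application layout and the RSpec test command are my own assumptions.

# Write a small Dockerfile for the app (assumes Gemfile, Gemfile.lock and code in the build context).
cat > Dockerfile <<'EOF'
FROM ruby:2.5-alpine
WORKDIR /app
COPY Gemfile Gemfile.lock ./
RUN bundle install
COPY . .
CMD ["bundle", "exec", "ruby", "app.rb"]
EOF

# Build the image, then run the tests inside a disposable container.
docker build -t my-ruby-app .
docker run --rm my-ruby-app bundle exec rspec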

Read more at The New Stack

The Spectre/Meltdown Performance Impact On Linux 4.20, Decimating Benchmarks With New STIBP Overhead


As outlined yesterday, significant slowdowns with the Linux 4.20 kernel turned out to be due to the addition of the kernel-side bits for STIBP (Single Thread Indirect Branch Predictors) for cross-HyperThread Spectre Variant Two mitigation. In its current state in Linux 4.20 Git, the STIBP support incurs significant performance penalties and is enabled by default, at least for Intel systems with up-to-date microcode. Here are some follow-up benchmarks looking at the performance hit with the Linux 4.20 development kernel, as well as the overall Spectre and Meltdown mitigation impact on this latest version of the Linux kernel.

Some users have said AMD also needs STIBP, but at least with Linux 4.20 Git and the AMD systems I have tested with their up-to-date BIOS/microcode, that hasn't appeared to be the case. Most of the AMD STIBP references date back to January, when Spectre/Meltdown first came to light. We'll see in the week ahead whether there is any comment from AMD, but at this time the slowdown appears to affect only up-to-date Intel systems running the Linux 4.20 kernel.

One of the most common requests since yesterday's article, which bisected the Linux 4.20 performance drop down to STIBP, was to quantify the overall performance cost of all the Spectre/Meltdown mitigations that have come about so far this year. I happened to do some tests on the latest Linux 4.20 Git, both in its default mitigated state "KPTI + __user pointer sanitization + Full generic retpoline IBPB IBRS_FW STIBP RSB filling + SSB disabled via prctl and seccomp + PTE Inversion; VMX: conditional cache flushes SMT vulnerable" and then again with the mitigations that can be disabled at run-time switched off.
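For reference, that mitigation summary is the kernel's own sysfs report, and you can read the same state on any recent kernel (the exact files present depend on the kernel version):

# Print the kernel's reported mitigation status for each known CPU vulnerability.
grep . /sys/devices/system/cpu/vulnerabilities/*

Disabling the run-time-controllable mitigations is generally done via kernel boot parameters such as nopti, nospectre_v2, spec_store_bypass_disable=off and l1tf=off; which parameters apply depends on the kernel version and CPU.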

The comparison of the overall Spectre/Meltdown cost on Linux 4.20 was done with the Intel Core i9 7980XE.

Following that comparison are some tests of the dual Intel Xeon Gold server with Linux 4.19, Linux 4.20, and then Linux 4.20 with no mitigations enabled. Those results are compared to the current AMD EPYC performance to see how the introduction of STIBP affects that positioning.

All of these benchmarks were facilitated in a fully-automated and reproducible manner using the open-source Phoronix Test Suite benchmarking software.
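Readers who want to reproduce this kind of testing can drive the Phoronix Test Suite entirely from the command line. As a rough sketch, assuming an Ubuntu/Debian system, and with the test profile shown being just one example rather than the set used in this article:

# Install the Phoronix Test Suite, then run a benchmark profile.
sudo apt-get install phoronix-test-suite
phoronix-test-suite benchmark pts/build-linux-kernel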


5 Easy Tips for Linux Web Browser Security | Linux.com


If you use your Linux desktop and never open a web browser, you are a special kind of user. For most of us, however, a web browser has become one of the most-used digital tools on the planet. We work, we play, we get news, we interact, we bank… the number of things we do via a web browser far exceeds what we do in local applications. Because of that, we need to be cognizant of how we work with web browsers, and do so with a nod to security. Why? Because there will always be nefarious sites and people attempting to steal information. Considering the sensitive nature of the information we send through our web browsers, it should be obvious why security is of utmost importance.

So, what is a user to do? In this article, I’ll offer a few basic tips, for users of all sorts, to help decrease the chances that your data will end up in the hands of the wrong people. I will be demonstrating on the Firefox web browser, but many of these tips cross the application threshold and can be applied to any flavor of web browser.

1. Choose Your Browser Wisely

Although most of these tips apply to most browsers, it is imperative that you select your web browser wisely. One of the more important aspects of browser security is the frequency of updates. New issues are discovered quite frequently, and you need a web browser that is as up to date as possible. Here is how the major browsers ranked by updates released in 2017:

  1. Chrome released 8 updates (with Chromium following up with numerous security patches throughout the year).

  2. Firefox released 7 updates.

  3. Edge released 2 updates.

  4. Safari released 1 update (although Apple does release 5-6 security patches yearly).

But even if your browser of choice releases an update every month, that update does you no good if you (as a user) don't upgrade. This can be problematic with certain Linux distributions. Although many of the more popular flavors of Linux do a good job of keeping web browsers up to date, others do not. So it's crucial that you manually keep on top of browser updates, because your distribution of choice might not include the latest version of your web browser in its standard repository. If that's the case, you can always download the latest version of the browser from the developer's download page and install it from there.
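As a rough sketch of what that manual install looks like, assuming a 64-bit system and Firefox as the browser in question (the /opt location and the symlink are common conventions, not requirements):

# Fetch the latest Firefox tarball from Mozilla and unpack it under /opt.
wget -O firefox-latest.tar.bz2 "https://download.mozilla.org/?product=firefox-latest&os=linux64&lang=en-US"
sudo tar -xjf firefox-latest.tar.bz2 -C /opt
# Make the freshly downloaded binary take precedence over any distro-packaged copy.
sudo ln -sf /opt/firefox/firefox /usr/local/bin/firefox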

If you like to live on the edge, you can always use a beta or daily-build version of your browser. Do note that using a daily build or beta version comes with the possibility of unstable software. Say, however, you're okay with using a daily build of Firefox on an Ubuntu-based distribution. To do that, add the necessary repository with the command:

sudo apt-add-repository ppa:ubuntu-mozilla-daily/ppa

Update apt and install the daily Firefox with the commands:

sudo apt-get update

sudo apt-get install firefox
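Once installed, you can confirm which build you are actually running with:

firefox --version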

What’s most important here is to never allow your browser to get far out of date. You want to have the most updated version possible on your desktop. Period. If you fail this one thing, you could be using a browser that is vulnerable to numerous issues.

2. Use A Private Window

Now that you have your browser updated, how do you best make use of it? If you happen to be of the really concerned type, you should consider always using a private window. Why? Private browser windows don't retain your data: No passwords, no cookies, no cache, no history… nothing. The one caveat to browsing through a private window is that (as you probably expect) every time you go back to a website or use a service, you'll have to re-type your credentials to log in. If you're serious about browser security, never saving credentials should be your default behavior.
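If you want private browsing to be the rule rather than the exception, you can also launch the browser directly into a private window from a terminal or a desktop launcher. With Firefox, for example:

firefox --private-window https://www.linux.com

Chromium offers the equivalent with its --incognito flag.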

This leads me to a reminder that everyone needs: Make your passwords strong! In fact, at this point in the game, everyone should be using a password manager to store very strong passwords. My password manager of choice is Universal Password Manager.

3. Protect Your Passwords

For some, having to retype those passwords every single time might be too much. So what do you do if you want to protect those passwords while not having to type them constantly? If you use Firefox, there's a built-in tool called Master Password. With this enabled, none of your browser's saved passwords are accessible until you correctly type the master password. To set this up, do the following:

  1. Open Firefox.

  2. Click the menu button.

  3. Click Preferences.

  4. In the Preferences window, click Privacy & Security.

  5. In the resulting window, click the checkbox for Use a master password (Figure 1).

  6. When prompted, type and verify your new master password (Figure 2).

  7. Close and reopen Firefox.

4. Know Your Extensions

There are plenty of privacy-focused extensions available for most browsers. What extensions you use will depend upon what you want to focus on. For myself, I choose the following extensions for Firefox:

  • Firefox Multi-Account Containers – Allows you to configure certain sites to open in a containerized tab.

  • Facebook Container – Always opens Facebook in a containerized tab (Firefox Multi-Account Containers is required for this).

  • Avast Online Security – Identifies and blocks known phishing sites and displays a website’s security rating (curated by the Avast community of over 400 million users).

  • Mining Blocker – Blocks all CPU-Crypto Miners before they are loaded.

  • PassFF – Integrates with pass (a UNIX password manager) to store credentials safely; see the short sketch after this list.

  • Privacy Badger – Automatically learns to block trackers.

  • uBlock Origin – Blocks trackers based on known lists.
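For anyone who hasn't used pass before, here is the minimal command-line sketch promised above; the GPG ID and entry name are placeholders:

# Initialize the password store against your GPG key (placeholder ID shown).
pass init "you@example.com"
# Add a credential; pass prompts for the secret and encrypts it with GPG.
pass insert web/example.com
# Retrieve it later; PassFF reads credentials from this same store inside Firefox.
pass show web/example.com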

Of course, you’ll find plenty more security-focused extensions for:

Not every web browser offers extensions. Some, such as Midori, offer a limited number of built-in plugins that can be enabled/disabled (Figure 3). However, you won't find third-party plugins available for the majority of these lightweight browsers.

5. Virtualize

For those who are concerned about exposing locally stored data to prying eyes, one option would be to use a browser only on a virtual machine. To do this, install the likes of VirtualBox, install a Linux guest, and then run whatever browser you like in the virtual environment. If you then apply the above tips, your browsing will be considerably safer.
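Purely as a sketch of a starting point (the VM name, memory size and CPU count are arbitrary assumptions), creating a dedicated browsing guest from the command line on an Ubuntu-based host might look like this; you would still need to attach a virtual disk and a Linux installer ISO, either through the VirtualBox GUI or with VBoxManage storagectl/storageattach:

# Install VirtualBox, register a new guest VM, and give it some resources.
sudo apt-get install virtualbox
VBoxManage createvm --name "browsing-vm" --ostype Ubuntu_64 --register
VBoxManage modifyvm "browsing-vm" --memory 4096 --cpus 2
# Boot the guest once a disk and installer ISO have been attached.
VBoxManage startvm "browsing-vm"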

The Truth of the Matter

The truth is, if the machine you are working from is on a network, you're never going to be 100% safe. However, if you use your web browser intelligently, you'll get more bang for your security buck and be less prone to having data stolen. The silver lining with Linux is that the chances of getting malicious software installed on your machine are far lower than if you were using another platform. Just remember to always use the latest release of your browser, keep your operating system updated, and use caution with the sites you visit.

Acumos Project’s 1st Software, Athena, Helps Ease AI Deployment | Software


By Jack M. Germain

Nov 16, 2018 5:00 AM PT

The LF Deep Learning Foundation on Wednesday announced the availability of the first software from the Acumos AI Project. Dubbed "Athena," it supports open source innovation in artificial intelligence, machine learning and deep learning.

This is the first software release from the Acumos AI Project since its launch earlier this year. The goal is to make critical new technologies available to developers and data scientists everywhere.

Acumos is part of a Linux Foundation umbrella organization, the LF Deep Learning Foundation, that supports and sustains open source innovation in artificial intelligence, machine learning and deep learning. Acumos is based in Shanghai.

Acumos AI is a platform and open source framework that makes it easy to build, share and deploy AI apps. Acumos standardizes the infrastructure stack and components required to run an out-of-the-box general AI environment, freeing data scientists and model trainers to focus on their core competencies, and accelerating innovation.

“The Acumos Athena release represents a significant step forward in making AI models more accessible for builders of AI applications and models, along with users and trainers of those models and applications,” said Scott Nicholas, senior director of strategic planning at The Linux Foundation. “This furthers the goal of LF Deep Learning and the Acumos project of accelerating overall AI innovation.”

The challenge with AI is that there are very few apps to use it, noted Jay Srivatsa, CEO of Future Wealth.

“Acumos was launched to create an AI marketplace, and the release of Athena is a first step in that direction,” he told LinuxInsider.

The Acumos AI Platform

Acumos packages toolkits such as TensorFlow and SciKit Learn, along with models that have a common API that allows them to connect seamlessly. The AI platform allows for easy onboarding and training of models and tools.

The platform supports a variety of popular software languages, including Java, Python, and R. The R language is a free software environment for statistical computing and graphics.

The Acumos AI Platform leverages modern microservices and containers to package and export production-ready AI applications as Docker files. It includes a federated AI Model Marketplace, which is a catalog of community-distributed AI models that can be shared securely.

LF Deep Learning members contribute to the evolution of the platform to ease the onboarding and the deployment of AI models, according to LF Deep Learning Outreach Committee Chair Jamil Chawki. The Acumos AI Marketplace is open and accessible to anyone who wants to download or contribute models and applications.

“Acumos Athena is a significant release because it enables the interoperability of AI, DL and ML models and prevents the lock-in that usually occurs whenever projects are built using disparate configurations, systems and deployment techniques,” explained Rishi Bhargava, cofounder of Demisto.

It will ease restrictions on AI, DL and ML developers by removing silos and allowing them to build standardized models, chain each other’s models together, and refine them through an out-of-the-box general AI environment, he told LinuxInsider.

“The efficiency of learning models is hugely contingent on the quality and uniqueness of data, the depth and repeatability of feature engineering, and selecting the best model for the task at hand,” Bhargava said. “Athena will free developers of extraneous burdens so they can focus on these core tasks, learn from each other, and eventually deliver better models to businesses and customers.”

Athena Release Highlights

Athena’s design is packed with features to make the software quick and easy to deploy, and to make it easy to share Acumos AI applications.

Athena can be deployed with one click using Docker or Kubernetes. The software also can deploy models into a public or private cloud infrastructure, or into a Kubernetes environment on users’ own hardware, including servers and virtual machines.

It utilizes a design studio graphical interface that enables chaining together multiple models, data translation tools, filters and output adapters into a full end-to-end solution. Also at play is a security token to allow simple onboarding of models from an external toolkit directly to an Acumos AI repository.

Models can easily be repurposed for different environments and hardware. This is done by decoupling microservices generation from the model onboarding process.

An advanced user portal allows users to personalize their marketplace view by theme and to see data on model authorship. This portal also allows users to share models privately or publicly.

“The LF Deep Learning Foundation is focused on building an ecosystem of AI, deep learning and machine learning projects, and today’s announcement represents a significant milestone toward achieving this vision,” said LF Deep Learning Technical Advisory Council Chair Ofer Hermoni of Amdocs.

Unifying Factor

The Acumos release is significant for the advancement of AI, DL and ML innovation, according to Edgar Radjabli, managing partner of Apis Capital Management.

The AI industry is very fragmented, with virtually no standardization.

“Companies building technology are usually required to write most from scratch or pay for expensive licensed cloud AI solutions,” Radjabli told LinuxInsider. “Acumos can help bring a base (protocol) layer standard to the industry, in the same way that HTTP did for the Internet and Linux itself did for application development.”

LF Deep Learning members are inspired and energized by the progress of the Acumos AI Project, noted Mazin Gilbert, vice president of advanced technology and systems at AT&T and the governing board chair of LF Deep Learning.

“Athena is the next step in harmonizing the AI community, furthering adoption and accelerating innovation,” he said.

Open Source More Suitable

Given the challenges of growing new technologies, open source models are better suited to the development process than those of commercial software firms. Open source base layer software is ideal. It allows greater adoption and interoperability between diverse projects from established players and startups, said Radjabli.

“I believe that Acumos will be used both by other open source projects building second-layer applications, as well as commercial applications,” he said.

Today, the same situation exists in other software development. Open source base layer protocols are used across the industry, both by other open source/nonprofit projects and commercial operations, he explained.

“Athena clearly is geared to an open source environment, given that it already has about 70 or more contributors,” said Future Wealth’s Srivatsa.

Benefits for Business and Consumers

The benefits to be gained from AI, DL and ML are very significant. Companies across the industry have been making progress in the development of unique applications for AI/DL/ML. More growth in this space will result from Acumos, according to Radjabli.

One example involves a company that uses neural networks for predictive healthcare analytics. This system allows it to diagnose breast cancer with zero percent false negatives simply from patient data correlation analysis. This does not involve any invasive testing or imaging, according to Radjabli.

“The correlation is comprised of over 40 variables, which means it would have never been found through traditional medical research data analysis and was only made possible through the use of convolutional and recurrent neural networks working in combination,” he said.

AI, DL and ML are all geared toward businesses understanding and predicting consumer behavior, added Srivatsa.

“Both will benefit,” he said.

What’s Next for Acumos AI

The developer community for Acumos AI already is working on the next release. The project expects it to be available in mid-2019.

The next release will introduce convenient model training, as well as data extraction pipelines to make models more flexible.

Additionally, the next release will include updates to assist closed-source model developers, such as secure and reliable licensing components to provide execution control and performance feedback across the community.


Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open source technologies. He has written numerous reviews of Linux distros and other open source software.






GNOME 3.31.2 Desktop Released – Phoronix



GNOME 3.31.2 is out this Friday as the latest development release in the trek towards next March’s GNOME 3.32 release.

Highlights for the GNOME 3.31.2 development milestone include:

– The Epiphany web browser has added preview widgets to its file choosers.

– Support for XPS files within the Flatpak version of the Evince document viewer. Meson is also now the default build system for the Flatpak version of Evince.

– GNOME Boxes virtualization client now sets the default machine type to the Intel Q35 model.

– Crash fixes for the Nautilus file manager.

– Sushi has been ported to the Meson build system.

– Various application icons were updated.

– Performance work and fixes for GNOME Shell and Mutter.

More details on the GNOME 3.31.2 development release via the mailing list announcement.