Monthly Archives: February 2017

How to Securely Transfer Files Between Servers with scp | Linux.com


If you run a live or home server, moving files between local machines or between two remote machines is a basic requirement. There are many ways to achieve that. In this article, we talk about scp (the secure copy command), which encrypts both the transferred files and the password so that no one can snoop. With scp, you don't have to start an FTP session or log into the system.

The scp tool relies on SSH (Secure Shell) to transfer files, so all you need is the username and password for the source and target systems. Another advantage is that scp lets you move files between two remote servers directly from your local machine, in addition to transferring data between local and remote machines. In that case, you need usernames and passwords for both servers. Unlike rsync, you don't have to log into either of the servers to transfer data from one machine to another.

This tutorial is aimed at new Linux users, so I will keep things as simple as possible. Let’s get started.

Copy a single file from the local machine to a remote machine:

The scp command needs a source and destination to copy files from one location to another location. This is the pattern that we use:

scp localmachine/path_to_the_file username@server_ip:/path_to_remote_directory

In the following example, I am copying a local file from my macOS system to my Linux server (macOS, being a UNIX operating system, has native support for all UNIX/Linux tools).

scp /Volumes/MacDrive/Distros/fedora.iso swapnil@10.0.0.75:/media/prim_5/media_server/

Here, ‘swapnil’ is the user on the server and 10.0.0.75 is the server's IP address. scp will ask you for that user's password and then copy the file securely.
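If you'd rather not type a password each time, scp accepts the same options as ssh. Below is a minimal sketch assuming a private key at ~/.ssh/id_rsa and SSH listening on port 2222 (both hypothetical values); the command is only printed here, not executed:

```shell
# -P selects a non-default SSH port; -i picks a private key, so no
# password prompt appears once key-based login is set up on the server.
# All values below are placeholders; echo just prints the command.
user="swapnil"; host="10.0.0.75"; dest="/media/prim_5/media_server/"
echo scp -P 2222 -i '~/.ssh/id_rsa' /tmp/fedora.iso "$user@$host:$dest"
```

Note that scp uses a capital -P for the port, unlike ssh's lowercase -p.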

I can do the same from my local Linux machine:

scp /home/swapnil/Downloads/fedora.iso swapnil@10.0.0.75:/media/prim_5/media_server/

If you are running Windows 10, you can use Ubuntu Bash on Windows to copy files from the Windows system to a Linux server:

scp /mnt/c/Users/swapnil/Downloads/fedora.iso swapnil@10.0.0.75:/media/prim_5/media_server/

Copy a local directory to a remote server:

If you want to copy the entire local directory to the server, then you can add the -r flag to the command:

scp -r localmachine/path_to_the_directory username@server_ip:/path_to_remote_directory/

Make sure that the source directory path doesn't end with a forward slash; the destination path, on the other hand, *must* end with a forward slash.

Copy all files in a local directory to a remote directory

What if you only want to copy the files inside a local directory to a remote directory? It's simple: just add a forward slash and * to the end of the source directory path, and give the path of the destination directory. Don't forget to add the -r flag to the command:

scp -r localmachine/path_to_the_directory/* username@server_ip:/path_to_remote_directory/
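One detail worth knowing: the * wildcard is expanded by your local shell before scp ever runs, so it matches exactly what echo would print for the same pattern, and it notably skips hidden dot-files. A quick local demonstration, using a throwaway /tmp directory:

```shell
# Create a demo directory with two visible files and one hidden file.
mkdir -p /tmp/scp_demo
touch /tmp/scp_demo/a.txt /tmp/scp_demo/b.txt /tmp/scp_demo/.hidden
# The shell expands the glob; .hidden is not matched by *.
echo /tmp/scp_demo/*
# prints: /tmp/scp_demo/a.txt /tmp/scp_demo/b.txt
```

If you need the hidden files as well, copy the whole directory (the previous -r example) instead of using the wildcard.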

Copying files from remote server to local machine

If you want to copy a single file, a directory, or all files on the server to the local machine, follow the same examples above and simply swap the source and destination.

Copy a single file:

scp username@server_ip:/path_to_the_remote_file local_machine/path_to_the_file

Copy a remote directory to a local machine:

scp -r username@server_ip:/path_to_remote_directory local-machine/path_to_the_directory/

Make sure that the source directory path doesn't end with a forward slash; the destination path, on the other hand, *must* end with a forward slash.

Copy all files in a remote directory to a local directory:

scp -r username@server_ip:/path_to_remote_directory/* local-machine/path_to_the_directory/ 

Copy files from one directory of a server to another directory on the same server, from your local machine

Usually I would ssh into the machine and then use the rsync command to perform the job, but with scp, I can do it easily without having to log into the remote server.

Copy a single file:

scp username@server_ip:/path_to_the_remote_file username@server_ip:/path_to_destination_directory/

Copy a directory from one location on a remote server to a different location on the same server:

scp -r username@server_ip:/path_to_the_remote_directory username@server_ip:/path_to_destination_directory/

Copy all files from one remote directory to another directory on the same server:

scp -r username@server_ip:/path_to_source_directory/* username@server_ip:/path_to_the_destination_directory/

Copy files from one remote server to another remote server from a local machine

Currently, I have to ssh into one server in order to use the rsync command to copy files to another server. With scp, I can move files between two remote servers directly from my local machine, without having to log into either of them:

Copy a single file:

scp username@server1_ip:/path_to_the_remote_file username@server2_ip:/path_to_destination_directory/

Copy a directory from one remote server to another remote server:

scp -r username@server1_ip:/path_to_the_remote_directory username@server2_ip:/path_to_destination_directory/

Copy all files in a directory on one remote server to a directory on another remote server:

scp -r username@server1_ip:/path_to_source_directory/* username@server2_ip:/path_to_the_destination_directory/
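A caveat about the server-to-server commands above: by default, OpenSSH's scp asks the first remote host to open a connection to the second, so server1 must be able to authenticate to server2. The -3 option instead relays the transfer through your local machine, using only your own credentials for each server. A sketch with the same placeholder names (the command is only printed here, not executed):

```shell
# -3 routes the data through the local machine, so server1 never has to
# contact server2 directly; you authenticate to each server yourself.
# Placeholder values; echo just prints the resulting command.
src="username@server1_ip:/path_to_the_remote_file"
dst="username@server2_ip:/path_to_destination_directory/"
echo scp -3 "$src" "$dst"
```

The trade-off is that all data flows through your machine, which can be slower than a direct server-to-server transfer.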

Conclusion

As you can see, once you understand how things work, it is quite easy to move your files around. That's what Linux is all about: invest a little time in understanding the basics, and then it's a breeze!

Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.

Adapting IT Operations to Emerging Trends: 3 Tips


For infrastructure management professionals, keeping up with new trends is a constant challenge. IT must constantly weigh the potential benefits and risks of adopting new technologies, as well as the pros and cons of continuing to maintain their legacy hardware and applications.

Some experts say that right now is a particularly difficult time for enterprise IT given the massive changes that are occurring. When asked about the trends affecting enterprise IT operations today, Keith Townsend, principal at The CTO Advisor, told me, “Obviously the biggest one is the cloud and the need to integrate cloud.”

In its latest market research, IDC predicts that public cloud services and infrastructure spending will grow 24.4% this year, and Gartner forecasts that the public cloud services market will grow 18% in 2017. By either measure, enterprises are going to be running a lot more of their workloads in the cloud, which means IT operations will need to adapt to deal with this new situation.

Townsend, who also is SAP infrastructure architect at AbbVie, said that the growth in hybrid cloud computing and new advancements like serverless computing and containers pose challenges for IT operations, given “the resulting need for automation and orchestration throughout the enterprise IT infrastructure.” He added, “Ultimately, they need to transform their organizations from a people, process and technology perspective.”

For organizations seeking to accomplish that transformation, Townsend offered three key pieces of advice.

Put the strategy first

Townsend said the biggest mistake he sees enterprises making “is investing in tools before they really understand their strategy.” Organizations know that their approach to IT needs to change, but they don’t always clearly define their goals and objectives.

Instead, Townsend said, they often start by “going out to vendors and asking vendors to solve this problem for them in the form of some tool or dashboard or some framework without understanding what the drivers are internally.”

IT operations groups can save themselves a great deal of time, money and aggravation by focusing on their strategy first before they invest in new tools.

Self-fund your transformation

Attaining the level of agility and flexibility that allows organizations to take advantage of the latest advances in cloud computing isn’t easy or cheap. “That requires some investment, but it’s tough to get that investment,” Townsend acknowledged.

Instead of asking for budget increases, he believes the best way to fund that investment is through self-funding.

Most IT teams spend about 80% of their budgets on maintaining existing systems, activities that are colloquially called “keeping the lights on.” That leaves only 20% of the budget for new projects and transformation. “That mix needs to be changed,” said Townsend.

He recommends that organizations look for ways to become more efficient. By carefully deploying automation and adopting new processes, teams can accomplish a “series of mini-transformations” that gradually decreases the amount of money that must be spent on maintenance and frees up more funds and staff resources for new projects.

Focus on agility, not services

In his work, Townsend has seen many IT teams make a common mistake when dealing with the business side of the organization: not paying enough attention to what is happening in the business and what the business really wants.

When the business comes to IT with a request, IT typically responds with a list of limited options. Townsend said that these limited options are the equivalent of telling the business no. “What they are asking for is agility,” he said.

He told a story about a recent six-month infrastructure project in which the business objectives completely changed between the beginning of the project and the end. An IT organization can only adapt to that sort of constant change by adopting a DevOps approach, he said. If IT wants to remain relevant and help organizations capitalize on the new opportunities that the cloud offers, it has to become much more agile and flexible.

You can see Keith Townsend live and in person at Interop ITX, where he will offer more insight about how enterprise IT needs to transform itself in his session, “Holistic IT Operations in the Application Age.” Register now for Interop ITX, May 15-19, in Las Vegas.




Understanding the Difference Between sudo and su | Linux.com


In one of our earlier articles, we discussed the ‘sudo’ command in detail. Toward the end of that tutorial, there was a brief mention of another, similar command: ‘su’.

In this article, we will discuss the ‘su’ command in detail, as well as how it differs from the ‘sudo’ command. The main job of the su command is to let you switch to another user during a login session. In other words, the tool lets you assume the identity of another user without having to log out and then log back in (as that user).
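To make the contrast concrete, here is a minimal sketch (the user ‘bob’ is hypothetical, and the commands are only printed since both would prompt for credentials): su asks for the *target* user's password and starts a shell as that user, while sudo asks for *your own* password and runs a single command with elevated rights.

```shell
# su - bob            -> prompts for bob's password, opens a login shell as bob
# sudo -u bob id -un  -> prompts for YOUR password, runs one command as bob
printf '%s\n' "su - bob" "sudo -u bob id -un"
```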

Read more at HowtoForge


Create A Caption, Win A Free Interop ITX Pass


Penning a clever cartoon caption could be your ticket to Interop ITX in May.

 

With spring approaching, we here at Network Computing are counting down to Interop ITX, our partner event. Held May 15-19 in Las Vegas, Interop ITX will feature more than 130 sessions across six tracks, intensive workshops, keynotes from IT luminaries, and plenty of networking opportunities. Sporting a new name, this year’s conference is shaping up to be the best ever.

So you definitely don’t want to miss out! And if you’re very clever, you can win a conference pass for the entire week of IT education. Our cartoonist created an illustration that’s in need of a caption. Let your creativity flow, and submit your caption in the comments section below.

Our panel of Network Computing editors and Interop experts will choose the lucky winner from the submissions. If you don’t want to enter a caption, you can help us pick a winner by voting on the submissions. You can vote for your favorites by clicking on the “thumbs up” icon directly below the comment.

The contest closes March 23. We’ll announce the winner on March 30 and provide the champion with a code to register for Interop ITX. Please note that travel and expenses are not included with the All Access Pass prize. Click here for additional contest rules.

Now it’s time to be ingenious and have fun creating those captions!




How to Install pandom: A True Random Number Generator for Linux | Linux.com


This tutorial explains how to install pandom: a timing-jitter true random number generator maintained by ncomputers.org. The built-in Linux kernel true random number generator provides low throughput under modern circumstances, for example on personal computers with solid-state drives (SSDs) and on virtual private servers (VPS). This problem is becoming more common in Linux deployments because of the continuously increasing need for true random numbers, mainly for diverse cryptographic purposes.

The tutorial applies to amd64/x86_64 Linux kernels version 2.6.9 and later.

Read more at HowToForge
