Linux and UNIX How To: Scripting SSH and SFTP

Thursday Dec 11th 2008 by Jeremy M. Jones

Learn how to establish secure, automated SSH connections and SFTP file transfers with Python and paramiko.

SSH is an indispensable tool that I use every day for file transfers, remote execution of tasks, setting up network port redirection between systems (tunneling), and securely driving a shell on a remote system. While the SSH command-line client on UNIX and Linux systems is how I interact most often with SSH servers on the remote end, there are times when it is helpful to script some action or series of actions rather than performing them interactively.

This is where Python and paramiko come in. paramiko is a library for Python that provides a programmatic interface to SSH. This combination of Python and SSH allows you to drive SSH tasks you would normally perform manually.

Installation of paramiko is pretty simple. If you're using Ubuntu or Debian, installation is a simple "sudo apt-get install python-paramiko" command in a terminal. If you aren't using Ubuntu or Debian, you can check to see if your system provides a package for paramiko. If it doesn't, then you can always use easy_install: "sudo easy_install paramiko".

After installing, the next step is to create an SSHClient object and connect it to an SSH server. I've created a module named "ssh_common.py" that contains a function to connect to a specific server on my network and then return an "ssh" object.


#!/usr/bin/env python

import paramiko
from contextlib import contextmanager

host = '192.168.10.142'
username = 'slacker'
password = 'insecure'

@contextmanager
def create_ssh(host=host, username=username, password=password):
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())

    try:
        print "creating connection"
        ssh.connect(host, username=username, password=password)
        print "connected"
        yield ssh
    finally:
        print "closing connection"
        ssh.close()
        print "closed"

The first couple of lines are imports: one for "paramiko" and another for "contextlib". "paramiko" contains all of the SSH code that we'll use. "contextlib" contains code that helps set up a context manager in a very simple way. Context managers are a fairly new feature in Python. I won't go into great detail in describing them here, but the gist of a context manager is that it allows you to create some resource that you can use in a "with" code block. It performs some initialization code before the "with" block is entered and then runs some finalization code after the "with" block is completed. I am using a context manager with an SSH connection because I know that before I use the connection, I need it to log in to the server, and after I am done with it, I want it to close.
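To see this setup/teardown ordering without any SSH in the mix, here is a minimal, standard-library-only sketch; the names ("managed_resource", "events") are just illustrative and not part of the article's module:

```python
from __future__ import with_statement
from contextlib import contextmanager

events = []

@contextmanager
def managed_resource():
    events.append("setup")         # runs when the "with" block is entered
    try:
        yield "the resource"
    finally:
        events.append("teardown")  # runs when the "with" block exits

with managed_resource() as r:
    events.append("using " + r)

# events is now ["setup", "using the resource", "teardown"]
```

The same pattern is what "create_ssh()" uses: the code before the "yield" is the initialization, and the "finally" clause is the finalization.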

After the imports, I set "host", "username", and "password" variables that the "create_ssh()" function will pick up by default. While this function will pick these values up by default, you can certainly override one or all of them if you import the module and call the function yourself.

Next, I define the function "create_ssh()" that I just mentioned. I set a "contextmanager" decorator on "create_ssh()" so that it can be used as a context manager. Inside "create_ssh()", I create an "SSHClient" object and refer to it as "ssh". I then set a host key policy to automatically accept host keys from any host it connects to. If you are concerned about security, you'll want to change this behavior so that it is not quite so permissive.

Next, I create a try/finally block. In the try/finally block, I connect to the specified server and then log in. After connecting, I "yield" the "ssh" object. A context manager executes the code up to the point of "yield" before anything else in the "with" block. (I'll get to an example of the "with" block in the next code example.)

In the "finally" block, I simply "close()" the connection. A context manager executes the code after the "yield" after everything else in the "with" block has run. In this case, it will execute the "finally" block even if an exception is raised when the "ssh" object is used. Note that before and after creating the connection and closing it, I have placed "print" statements to detail what is going on in the code.
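That exception behavior is easy to verify with another toy, SSH-free context manager; here the cleanup code still runs even though the "with" block raises (the names "noisy" and "log" are just illustrative):

```python
from __future__ import with_statement
from contextlib import contextmanager

log = []

@contextmanager
def noisy():
    log.append("open")
    try:
        yield
    finally:
        log.append("close")  # runs even when the "with" block raises

try:
    with noisy():
        raise RuntimeError("boom")
except RuntimeError:
    pass

# log is ["open", "close"]: the cleanup ran despite the exception,
# just as create_ssh() would still close the connection
```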

Here is a very simple example that creates a "with" block and uses the "ssh" object:


#!/usr/bin/env python

from __future__ import with_statement
import ssh_common

with ssh_common.create_ssh() as ssh:
    print ssh

All that I do in this example is to get an "ssh" object from the "create_ssh()" function in the "with" statement and then print the "ssh" object.

Here is the output from running this simple code:


jmjones: python_ssh$ python simple_connect.py
creating connection
connected
<paramiko.SSHClient object at 0xdcaf90>
closing connection
closed

You can see that the pre- and post- "yield" statements ran before and after the print statement that displayed the "ssh" object. This means that the "ssh" object was initialized before any code ran in the "with" block and it was finalized after the code ran inside the "with" block.


Now, we can get to something that is a little more useful. Sometimes you may want to execute a command on a remote system and gather the output from the command. The following example executes an "ls" command on the remote server that we connected to in the previous examples.


#!/usr/bin/env python

from __future__ import with_statement
import ssh_common

with ssh_common.create_ssh() as ssh:
    cmd = 'ls -l /home/slacker/files'
    stdin, stdout, stderr = ssh.exec_command(cmd)

    print "STDOUT"

    print "+" * 40
    print stdout.read()
    print "-" * 40

    print "STDERR"
    print "+" * 40
    print stderr.read()
    print "-" * 40


The only difference between this example and the previous one is the call to the "exec_command()" method on the "ssh" object. The command I passed in was "ls -l /home/slacker/files". "exec_command()" returns a tuple of file-like objects connected to the standard input, output, and error of the remote process. The rest of the code prints the standard output and standard error from the remote execution. Here is what happens when I run the script:


jmjones: python_ssh$ python simple_exec.py
creating connection
connected
STDOUT
++++++++++++++++++++++++++++++++++++++++
total 48
-rw-rw-r-- 1 slacker slacker 5855 Nov 19 19:15 a.txt
-rw-rw-r-- 1 slacker slacker 5855 Nov 19 19:15 b.txt
-rw-rw-r-- 1 slacker slacker 5855 Nov 19 19:15 c.txt
-rw-rw-r-- 1 slacker slacker 5855 Nov 19 19:15 d.txt

----------------------------------------
STDERR
++++++++++++++++++++++++++++++++++++++++

----------------------------------------
closing connection
closed

The script displayed the listing of four files: "a.txt", "b.txt", "c.txt", and "d.txt". The file listing came from stdout and nothing came from stderr. I wrote this script so that it would do a file listing in order to tie into the next examples, but you could replace that "ls" command with anything: a "ps" command to check for running processes, a "netstat" command to see if any process is binding a specific port, or a "df" command to check free disk space.
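One refinement worth knowing about when scripting commands like these: the channel behind the returned file objects also carries the remote command's exit status, so your script can branch on success or failure instead of parsing output. A small hypothetical helper (the name "run_command" is mine) might look like this; it works with any object that provides an SSHClient-style "exec_command()" method:

```python
def run_command(ssh, cmd):
    # Hypothetical convenience wrapper around SSHClient.exec_command().
    # Read the output streams first, then ask the channel for the remote
    # command's exit status (0 conventionally means success).
    stdin, stdout, stderr = ssh.exec_command(cmd)
    out = stdout.read()
    err = stderr.read()
    return out, err, stdout.channel.recv_exit_status()
```

With a real connection, "run_command(ssh, 'df -h')" would let a monitoring script check the exit status before trusting the output.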

Now, on to something else useful. I commonly use SSH to transfer files between servers. Actually, I use SFTP or SCP, but these are part of the SSH suite. The following module contains a function that takes an "ssh" object and returns an "sftp" object wrapped in context manager goodness.


#!/usr/bin/env python

from contextlib import contextmanager

@contextmanager
def create_sftp(ssh):
    print "creating sftp"
    sftp = ssh.open_sftp()
    print "created"

    try:
        yield sftp
    finally:
        print "closing sftp"
        sftp.close()
        print "sftp closed"

And here is a script that uses the "ssh" and "sftp" context managers and retrieves the files in "/home/slacker/files".


#!/usr/bin/env python

from __future__ import with_statement
import ssh_common
import sftp_common
import os

with ssh_common.create_ssh() as ssh:
    with sftp_common.create_sftp(ssh) as sftp:
        sftp.chdir('/home/slacker/files')
        for f in sftp.listdir():
            print "retrieving", f
            sftp.get(f, os.path.join('files', f))


This script is pretty straightforward. At the beginning of the outer "with" block, the script creates an "ssh" object. At the beginning of the inner "with" block, the script creates an "sftp" object. Inside the inner "with" block, the script changes directory to "/home/slacker/files", then does a directory listing and retrieves each of the files listed. When the file retrieval part is finished, the context managers execute the "finalization" code, closing the "sftp" and "ssh" connections.
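If you find yourself mirroring directories like this often, the retrieval loop is easy to factor into a small helper. The name "fetch_all" below is my own, and it adds one nicety the script above assumes: it creates the local target directory if it is missing. It works with any SFTPClient-like object that has "chdir()", "listdir()", and "get()" methods:

```python
import os

def fetch_all(sftp, remote_dir, local_dir):
    # Hypothetical helper: download every file in remote_dir into
    # local_dir, creating local_dir first if it does not already exist.
    if not os.path.isdir(local_dir):
        os.makedirs(local_dir)
    sftp.chdir(remote_dir)
    for name in sftp.listdir():
        sftp.get(name, os.path.join(local_dir, name))
```

With the context managers above, the body of the script shrinks to a single call: "fetch_all(sftp, '/home/slacker/files', 'files')".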

Here is the result of running the "sftp" script:


jmjones: python_ssh$ python simple_xfer.py
creating connection
connected
creating sftp
created
retrieving d.txt
retrieving c.txt
retrieving a.txt
retrieving b.txt
closing sftp
sftp closed
closing connection
closed

paramiko makes the possibilities for SSH automation truly mind-boggling. It also simplifies connecting to an SSH server from Python code you have already written. If you have to interact with SSH servers on a regular basis, download paramiko and play around with it. It's bound to simplify your work.

This article was first published on EnterpriseITPlanet.com.
