Deployment Automation via SSH With Python Fabric: How It Works

by Aswin G

One of the most basic ways in which a project gets deployed is by SSHing into a remote host followed by executing a few basic commands. Apart from deployment, this can also be useful for running any command you want on a remote host, such as when a CI/CD pipeline is triggered. In this article I'll be taking a look at how to deploy a basic project to a remote server through Gitlab CI using Python Fabric.

What is Fabric?

Fabric is an open source Python library that is used to execute commands remotely over SSH.

Fabric is compatible with Python 2 and 3. To use it with Python 3, install it with:


    pip3 install fabric




Since the goal here is to use Fabric in an automated deployment pipeline, you don't actually need to have Fabric installed on your local machine (unless that is where your pipeline runs).


Create a fabfile

Installing Fabric also installs the fab binary stub, which essentially allows Fabric to read and execute commands defined in a file called the fabfile. Create a file named fabfile.py and start out with the following code:

    from fabric import task

    @task
    def deploy(ctx):
        print("Inside the task!")

This defines a task called "deploy" which can be passed as an argument to the fab binary for execution. The @task decorator (which is the Python equivalent of a closure - a subroutine that takes another subroutine as a parameter or returns another subroutine) is used to convert the function to a task that can be executed by the fab binary. The function must take a context argument, which is given as ctx in this case. On executing this by running fab deploy from the same directory as fabfile.py, "Inside the task!" will be printed as the output.

Now to use this to do what Fabric is meant for - connecting to a remote host and executing commands on it.


    from fabric import Connection, task

    @task
    def deploy(ctx):
        with Connection("host") as c:
            with c.cd("/home/project/path/"):
                c.run("docker-compose down")
                c.run("git pull origin master --recurse-submodules --rebase")
                c.run("docker-compose up --build -d")

Here, replace host with the name or IP address of the host to which you are establishing a connection. From there, you can execute any command that the user as which you are logged in has permission to run. c.cd() is used to ensure the remaining commands are executed from that particular folder. The c.run() commands I have given after this are just examples - replace them with what you need to execute. In this example I'm making use of Python context managers with the help of with blocks, but there are many other ways to do this as well. You can read up on the Fabric documentation to explore them.

By now you might have noticed that just the host name is often not enough to establish a connection. What about the username and the private key needed for an SSH connection? This isn't immediately obvious, but because Fabric uses a library called Paramiko under the hood for handling the SSH side of things, we can pass arguments to Paramiko's SSHClient.connect method by using the connect_kwargs argument in Fabric's Connection. This looks like the following:

    from fabric import Connection, task

    @task
    def deploy(ctx):
        with Connection(
            "host",
            user="USERNAME",
            connect_kwargs={"key_filename": "~/.ssh/your_key"},
        ) as c:
            with c.cd("/home/project/path/"):
                c.run("docker-compose down")
                c.run("git pull origin master --recurse-submodules --rebase")
                c.run("docker-compose up --build -d")

Paramiko's SSHClient.connect takes a key_filename argument, which specifies the filename of the key to be used. Now you are passing all the info required to establish a connection via SSH. You can also pass the SSH key as an instance of Paramiko's pkey.PKey class, or make use of a bunch of other options, which you can find in Paramiko's documentation.

Of course, it's much safer to load all of these as environment variables. When using Gitlab CI, while setting the environment variables you can choose to make the key available as a file instead of a string, which is needed to make it work with the key_filename argument.

Here is a screenshot of the Variables section under Gitlab's CI/CD settings, with the hostname added as a string and the key added as a file.


You might need to load these values in other ways depending on how your team is organized and as per your security requirements.


With these environment variables in place, the fabfile changes to this:


    import os

    from fabric import Connection, task

    @task
    def deploy(ctx):
        with Connection(
            os.environ["HOST"],
            user="USERNAME",
            connect_kwargs={"key_filename": os.environ["DEPLOY_KEY_FILE"]},
        ) as c:
            with c.cd("/home/project/path/"):
                c.run("docker-compose down")
                c.run("git pull origin master --recurse-submodules --rebase")
                c.run("docker-compose up --build -d")
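A missing CI variable would otherwise surface as an unhelpful KeyError mid-deploy. A small guard at the top of the fabfile can fail fast with a readable message instead; this helper is my own addition, and the variable names are assumed to match whatever your fabfile reads:

```python
import os


def missing_vars(names, environ=os.environ):
    """Return which of the required variable names are unset or empty."""
    return [name for name in names if not environ.get(name)]


def ensure_deploy_vars(names=("HOST", "DEPLOY_KEY_FILE"), environ=os.environ):
    # Call this at the start of the deploy task so a misconfigured
    # pipeline fails immediately with a clear message.
    missing = missing_vars(names, environ)
    if missing:
        raise SystemExit("missing required variables: " + ", ".join(missing))
```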

Setting up Gitlab CI


Now that the fabfile is ready, executing it with the Python Fabric package is all we have to do. This can be done in any way you want - manually, using Github Actions, etc. Here I have an example of running it with Gitlab's CI.


To do this, create a .gitlab-ci.yml file at the project root. An example is given below:

    image: "python:3.6"

    stages:
      - deploy

    deploy_to_production:
      stage: deploy
      script:
        - pip3 install fabric
        - fab deploy
      only:
        - master

This is a basic configuration file that should be easy to understand if you're familiar with Gitlab's CI. It just has one job called deploy_to_production (it uses a Python 3 image as base) that executes two things: pip3 install fabric to install Fabric, and fab deploy, which makes Fabric read our fabfile.py and execute the task named "deploy" in it.

Conclusion


    Fabric has quite a lot more options than what is outlined here - such as dealing with sending the sudo password to the remote host when it's required, dealing with multiple hosts etc. It's worth going through their documentation if you're planning to make use of it.

In conclusion, this is a very simple and effective way to programmatically build scripts to be executed over SSH and trigger them from common methods such as through a CI/CD tool.


I hope you found this post useful. You can find me on Twitter and LinkedIn.
