
      How To Add Unit Testing to Your Django Project


      The author selected the Open Internet/Free Speech Fund to receive a donation as part of the Write for DOnations program.

      Introduction

It is nearly impossible to build a website that works perfectly the first time. For that reason, you need to test your web application to find errors and fix them proactively. To improve the efficiency of tests, it is common to break testing down into units that exercise specific functionalities of the web application. This practice is called unit testing. It makes errors easier to detect because the tests focus on small parts (units) of your project independently from other parts.

Testing a website can be a complex task because it is made up of several layers of logic, such as handling HTTP requests, validating forms, and rendering templates. However, Django provides a set of tools that makes testing your web application seamless. In Django, the preferred way to write tests is to use the Python unittest module, although it is possible to use other testing frameworks.

      In this tutorial, you will set up a test suite in your Django project and write unit tests for the models and views in your application. You will run these tests, analyze their results, and learn how to find the causes of failing tests.

      Prerequisites

      Before beginning this tutorial, you’ll need the following:

      Step 1 — Adding a Test Suite to Your Django Application

      A test suite in Django is a collection of all the test cases in all the apps in your project. To make it possible for the Django testing utility to discover the test cases you have, you write the test cases in scripts whose names begin with test. In this step, you’ll create the directory structure and files for your test suite, and create an empty test case in it.

      If you followed the Django Development tutorial series, you’ll have a Django app called blogsite.

      Let’s create a folder to hold all our testing scripts. First, activate the virtual environment:

      • cd ~/my_blog_app
      • . env/bin/activate

      Then navigate to the blogsite app directory, the folder that contains the models.py and views.py files, and then create a new folder called tests:

      • cd ~/my_blog_app/blog/blogsite
      • mkdir tests

      Next, you’ll turn this folder into a Python package, so add an __init__.py file:

      • cd ~/my_blog_app/blog/blogsite/tests
      • touch __init__.py

      You’ll now add a file for testing your models and another for testing your views:

      • touch test_models.py
      • touch test_views.py

Finally, you will create an empty test case in test_models.py. You will need to import the Django TestCase class and make it a superclass of your own test case class. Later on, you will add methods to this test case to test the logic in your models. Open the file test_models.py:

• nano test_models.py

      Now add the following code to the file:

      ~/my_blog_app/blog/blogsite/tests/test_models.py

      from django.test import TestCase
      
      class ModelsTestCase(TestCase):
          pass
      

      You’ve now successfully added a test suite to the blogsite app. Next, you will fill out the details of the empty model test case you created here.

      Step 2 — Testing Your Python Code

In this step, you will test the logic of the code written in the models.py file. In particular, you will test the save method of the Post model to ensure it creates the correct slug from a post’s title when called.

      Let’s begin by looking at the code you already have in your models.py file for the save method of the Post model:

      • cd ~/my_blog_app/blog/blogsite
      • nano models.py

      You’ll see the following:

      ~/my_blog_app/blog/blogsite/models.py

      class Post(models.Model):
          ...
          def save(self, *args, **kwargs):
              if not self.slug:
                  self.slug = slugify(self.title)
              super(Post, self).save(*args, **kwargs)
          ...
      

      We can see that it checks whether the post about to be saved has a slug value, and if not, calls slugify to create a slug value for it. This is the type of logic you might want to test to ensure that slugs are actually created when saving a post.
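If you haven’t worked with slugify before: it turns a string into a lowercase, URL-friendly form. Here is a quick standalone illustration (a sketch using django.utils.text.slugify, the utility function that the template filter wraps):

from django.utils.text import slugify

# Punctuation is dropped, text is lowercased, and spaces become hyphens
print(slugify("My first post"))                # my-first-post
print(slugify("Testing & Django, together!"))  # testing-django-together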

      Close the file.

To test this, go back to test_models.py:

• nano tests/test_models.py

      Then update it to the following, adding in the highlighted portions:

      ~/my_blog_app/blog/blogsite/tests/test_models.py

      from django.test import TestCase
      from django.template.defaultfilters import slugify
      from blogsite.models import Post
      
      
      class ModelsTestCase(TestCase):
          def test_post_has_slug(self):
              """Posts are given slugs correctly when saving"""
              post = Post.objects.create(title="My first post")
      
              post.author = "John Doe"
              post.save()
              self.assertEqual(post.slug, slugify(post.title))
      

      This new method test_post_has_slug creates a new post with the title "My first post" and then gives the post an author and saves the post. After this, using the assertEqual method from the Python unittest module, it checks whether the slug for the post is correct. The assertEqual method checks whether the two arguments passed to it are equal as determined by the "==" operator and raises an error if they are not.
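To see that behavior in isolation, here is a minimal, self-contained unittest sketch (independent of Django and of this project):

import unittest


class EqualityExample(unittest.TestCase):
    def test_equal_strings(self):
        # Passes: the two arguments compare equal with ==
        self.assertEqual("my-first-post", "my-first-post")

    def test_unequal_strings(self):
        # assertEqual raises AssertionError when the arguments differ
        with self.assertRaises(AssertionError):
            self.assertEqual("my-first-post", "another-post")


if __name__ == "__main__":
    unittest.main()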

      Save and exit test_models.py.

      This is an example of what can be tested. The more logic you add to your project, the more there is to test. If you add more logic to the save method or create new methods for the Post model, you would want to add more tests here. You can add them to the test_post_has_slug method or create new test methods, but their names must begin with test.
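For example, a hypothetical additional test method (a sketch only; it assumes the Post model behaves as shown above) could assert that an explicitly set slug is not overwritten on save:

    def test_existing_slug_unchanged(self):
        """A slug set explicitly is not overwritten when saving"""
        post = Post.objects.create(title="Another post")
        post.slug = "custom-slug"
        post.save()
        self.assertEqual(post.slug, "custom-slug")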

      You have successfully created a test case for the Post model where you asserted that slugs are correctly created after saving. In the next step, you will write a test case to test views.

      Step 3 — Using Django’s Test Client

In this step, you will write a test case that tests a view using the Django test client. The test client is a Python class that acts as a dummy web browser, allowing you to test your views and interact with your Django application the same way a user would. You can access the test client by referring to self.client in your test methods. For example, let’s create a test case in test_views.py. First, open the test_views.py file:

• nano tests/test_views.py

      Then add the following:

      ~/my_blog_app/blog/blogsite/tests/test_views.py

      from django.test import TestCase
      
      
      class ViewsTestCase(TestCase):
          def test_index_loads_properly(self):
              """The index page loads properly"""
        response = self.client.get('/')
              self.assertEqual(response.status_code, 200)
      

The ViewsTestCase contains a test_index_loads_properly method that uses the Django test client to request the index page of the website by its path, / (the page a user would reach in a browser at http://your_server_ip:8000, where your_server_ip is the IP address of the server you are using). Then the test method checks whether the response has a status code of 200, which means the page responded without any errors. As a result, you can be confident that when a user visits the page, it will respond without errors too.

      Apart from the status code, you can read about other properties of the test client response you can test in the Django Documentation Testing Responses page.
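For instance, here is a hedged sketch of two more assertions you could add to the same test method; the template name index.html is an assumption about this project, not something established in this tutorial:

        response = self.client.get('/')

        # Assert which template rendered the page (template name assumed)
        self.assertTemplateUsed(response, 'index.html')

        # response.content holds the raw response body as bytes
        self.assertIn(b'<title>', response.content)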

      In this step, you created a test case for testing that the view rendering the index page works without errors. There are now two test cases in your test suite. In the next step you will run them to see their results.

      Step 4 — Running Your Tests

Now that you have finished building a suite of tests for the project, it is time to execute these tests and see their results. To run the tests, navigate to the blog folder (the directory containing the project’s manage.py file):

• cd ~/my_blog_app/blog

Then run them with:

• python manage.py test

      You’ll see output similar to the following in your terminal:

      Output

Creating test database for alias 'default'...
System check identified no issues (0 silenced).
..
----------------------------------------------------------------------
Ran 2 tests in 0.007s

OK
Destroying test database for alias 'default'...

In this output, there are two dots (..), each of which represents a passed test case. Now you’ll modify test_views.py to trigger a failing test. First, open the file with:

• nano blogsite/tests/test_views.py

      Then change the highlighted code to:

      ~/my_blog_app/blog/blogsite/tests/test_views.py

      from django.test import TestCase
      
      
      class ViewsTestCase(TestCase):
          def test_index_loads_properly(self):
              """The index page loads properly"""
        response = self.client.get('/')
              self.assertEqual(response.status_code, 404)
      

Here you have changed the expected status code from 200 to 404. Now run the test again from the directory containing manage.py:

• python manage.py test

      You’ll see the following output:

      Output

Creating test database for alias 'default'...
System check identified no issues (0 silenced).
.F
======================================================================
FAIL: test_index_loads_properly (blogsite.tests.test_views.ViewsTestCase)
The index page loads properly
----------------------------------------------------------------------
Traceback (most recent call last):
  File "~/my_blog_app/blog/blogsite/tests/test_views.py", line 8, in test_index_loads_properly
    self.assertEqual(response.status_code, 404)
AssertionError: 200 != 404

----------------------------------------------------------------------
Ran 2 tests in 0.007s

FAILED (failures=1)
Destroying test database for alias 'default'...

      You see that there is a descriptive failure message that tells you the script, test case, and method that failed. It also tells you the cause of the failure, the status code not being equal to 404 in this case, with the message AssertionError: 200 != 404. The AssertionError here is raised at the highlighted line of code in the test_views.py file:

      ~/my_blog_app/blog/blogsite/tests/test_views.py

      from django.test import TestCase
      
      
      class ViewsTestCase(TestCase):
          def test_index_loads_properly(self):
              """The index page loads properly"""
        response = self.client.get('/')
              self.assertEqual(response.status_code, 404)
      

      It tells you that the assertion is false, that is, the response status code (200) is not what was expected (404). Preceding the failure message, you can see that the two dots .. have now changed to .F, which tells you that the first test case passed while the second didn’t.

      Conclusion

      In this tutorial, you created a test suite in your Django project, added test cases to test model and view logic, learned how to run tests, and analyzed the test output. As a next step, you can create new test scripts for Python code not in models.py and views.py.

      Following are some articles that may prove helpful when building and testing websites with Django:

      You can also check out our Django topic page for further tutorials and projects.




      Business Continuity and Disaster Recovery Basics: Testing 101


      “Luck is what happens when preparation meets opportunity.” – Seneca

      As I covered in another blog post, the first step to any effective business continuity and disaster recovery program is crafting a thoughtful, achievable plan.

      But having a great business continuity and disaster recovery plan on paper doesn’t mean that the work is done. After all, how do you evaluate the efficacy of your plan or make adjustments before you actually need it? The answer: by putting it to the test.

      Disaster Recovery Plan Testing

      I am fond of saying that managed services are a three-legged stool made up of technology, people and processes. If you lose any one leg, the stool falls over. And since an IT department is essentially offering managed services to the wider organization, IT management should think in terms of the same triad.

      Let’s break it down:

      • Technology: the tool or set of tools to be used
      • People: trained, knowledgeable staff to operate the technology
      • Processes: the written instructions for the people to follow when operating the technology. (See another blog I wrote for more information: “6 Processes You Need to Mature Your Managed Services.”)

      For a disaster recovery scenario, you need to test the stool to make sure that each leg is ready and that the people know what to do when the time comes. One useful tool for this is a tabletop exercise (TTX). The purpose of the TTX is to simply get people thinking about what technology they touch and what processes are already in place to support their tasks.

      Tabletop Exercise Steps

      Let’s walk through the stages of a typical TTX.

      No. 1: Develop a Narrative

      Write a quick narrative for the disaster. Start off assuming all your staff are available, and then work through threats that you may have already identified. Some examples:

      • Over the weekend, a train derailed, spilling hazardous materials. The fire department has evacuated an area that includes your headquarters, which contains important servers.
      • Just 10 minutes ago, your firm’s servers were all struck by a ransomware attack.
      • Heavy rains have occurred, and the server room in the basement is starting to flood.

      Now, some questions and prompts for your staff:

      • What should we do?
      • How do we communicate during this?
      • How do we continue to support the business?
      • What are you doing? Show me! (Pointing isn’t usually polite, but this might be a time to do so.)
      • How do we communicate the event to clients, customers, users, etc.?

      Going through the exercise, you’ll likely find that certain recovery processes are not properly documented or even completely missing. For example, your network administrator might not have a written recovery process. Have them and any other relevant staff produce and formalize the process, ready to be shared at the next TTX.

      Continue this way for all the role-players until your team can successfully work through the scenario.  You will want to thoroughly test people’s roles, whether in networking, operating systems, applications, end user access or any other area.

      No. 2: Insert Some Realism

Unfortunately, we have all seen emergency situations, such as the 9/11 terrorist attacks, where key personnel are missing, incapacitated or even deceased. In less dire scenarios, some staff might not be able to work because their home or family was affected by the disaster. For the purposes of a TTX, you can simply designate someone as being on vacation and unreachable, then have them sit out.

      Ask:

      • Who picks up their duties?
      • Does the replacement know where to find the documentation?
      • Can the replacement read and understand the written documentation?

      No. 3: “DIVE, DIVE, DIVE!”—Always Be Prepared

      Just like a submarine commander might call a crash dive drill at the most inopportune time, call a TTX drill on your own team to test the plan. For this, someone might actually be on vacation. Use that to your advantage to make sure that the whole team knows how to step in and how to communicate throughout the drill. You might even plan the drill to coincide with a key player’s vacation for added realism.

      No. 4: Break Away From the Table

Once you’ve executed your tabletop exercise, it’s time to do a real test! Have your team actually work through all of the steps of the process to fail over to the recovery site.

      Again, you will want to test that the servers and application can all be turned up at the recovery environment. To prevent data islands, make certain that users can successfully access your applications’ recovery site from where they would operate during a disaster. Here are some questions for user access testing:

      • Can users reach the replica site over the internet/VPN?
      • Can users use remote desktop protocol (RDP) to connect to servers in the replica environment?
      • If users in an office were displaced, could they reach the replica site from home using an SSL VPN?

      No. 5: Bring in a Trusted Service Partner

An IT service provider’s help doesn’t have to stop with managing your Disaster Recovery as a Service infrastructure or environment. With every INAP DRaaS solution, you get white glove onboarding and periodic testing to make sure that your plans are as robust as you need them to be. Between scheduled tests, you can also test your failover at will, taking your staff beyond tabletop exercises to evaluate their ability to recover the environment on their own. Staying prepared to handle disaster is a continuous process, and we can be there every step of the way to guide you through it.

      Explore INAP Disaster Recovery as a Service.


      Paul Painter
      • Director, Solution Architecture


Paul Painter is Director, Solution Architecture. He manages the central U.S. region, with his team supporting sales by providing quality presales engineering and optimizing customer onboarding processes.




      How To Implement Continuous Testing of Ansible Roles Using Molecule and Travis CI on Ubuntu 18.04


      The author selected the Mozilla Foundation to receive a donation as part of the Write for DOnations program.

      Introduction

      Ansible is an agentless configuration management tool that uses YAML templates to define a list of tasks to be performed on hosts. In Ansible, roles are a collection of variables, tasks, files, templates and modules that are used together to perform a singular, complex function.

      Molecule is a tool for performing automated testing of Ansible roles, specifically designed to support the development of consistently well-written and maintained roles. Molecule’s unit tests allow developers to test roles simultaneously against multiple environments and under different parameters. It’s important that developers continuously run tests against code that often changes; this workflow ensures that roles continue to work as you update code libraries. Running Molecule using a continuous integration tool, like Travis CI, allows for tests to run continuously, ensuring that contributions to your code do not introduce breaking changes.

      In this tutorial, you will use a pre-made base role that installs and configures an Apache web server and a firewall on Ubuntu and CentOS servers. Then, you will initialize a Molecule scenario in that role to create tests and ensure that the role performs as intended in your target environments. After configuring Molecule, you will use Travis CI to continuously test your newly created role. Every time a change is made to your code, Travis CI will run molecule test to make sure that the role still performs correctly.

      Prerequisites

      Before you begin this tutorial, you will need:

      Step 1 — Forking the Base Role Repository

      You will be using a pre-made role called ansible-apache that installs Apache and configures a firewall on Debian- and Red Hat-based distributions. You will fork and use this role as a base and then build Molecule tests on top of it. Forking allows you to create a copy of a repository so you can make changes to it without tampering with the original project.

      Start by creating a fork of the ansible-apache role. Go to the ansible-apache repository and click on the Fork button.

      Once you have forked the repository, GitHub will lead you to your fork’s page. This will be a copy of the base repository, but on your own account.

      Click on the green Clone or Download button and you’ll see a box with Clone with HTTPS.

      Copy the URL shown for your repository. You’ll use this in the next step. The URL will be similar to this:

      https://github.com/username/ansible-apache.git
      

      You will replace username with your GitHub username.

      With your fork set up, you will clone it on your server and begin preparing your role in the next section.

      Step 2 — Preparing Your Role

      Having followed Step 1 of the prerequisite How To Test Ansible Roles with Molecule on Ubuntu 18.04, you will have Molecule and Ansible installed in a virtual environment. You will use this virtual environment for developing your new role.

      First, activate the virtual environment you created while following the prerequisites by running:

      • source my_env/bin/activate

      Run the following command to clone the repository using the URL you just copied in Step 1:

      • git clone https://github.com/username/ansible-apache.git

      Your output will look similar to the following:

      Output

Cloning into 'ansible-apache'...
remote: Enumerating objects: 16, done.
remote: Total 16 (delta 0), reused 0 (delta 0), pack-reused 16
Unpacking objects: 100% (16/16), done.

Move into the newly created directory:

• cd ansible-apache

      The base role you've downloaded performs the following tasks:

      • Includes variables: The role starts by including all the required variables according to the distribution of the host. Ansible uses variables to handle the disparities between different systems. Since you are using Ubuntu 18.04 and CentOS 7 as hosts, the role will recognize that the OS families are Debian and Red Hat respectively and include variables from vars/Debian.yml and vars/RedHat.yml.

      • Includes distribution-relevant tasks: These tasks include tasks/install-Debian.yml and tasks/install-RedHat.yml. Depending on the specified distribution, it installs the relevant packages. For Ubuntu, these packages are apache2 and ufw. For CentOS, these packages are httpd and firewalld.

      • Ensures latest index.html is present: This task copies over a template templates/index.html.j2 that Apache will use as the web server's home page.

      • Starts relevant services and enables them on boot: Starts and enables the required services installed as part of the first task. For CentOS, these services are httpd and firewalld, and for Ubuntu, they are apache2 and ufw.

      • Configures firewall to allow traffic: This includes either tasks/configure-Debian-firewall.yml or tasks/configure-RedHat-firewall.yml. Ansible configures either Firewalld or UFW as the firewall and whitelists the http service.

      Now that you have an understanding of how this role works, you will configure Molecule to test it. You will write test cases for these tasks that cover the changes they make.

      Step 3 — Writing Your Tests

      To check that your base role performs its tasks as intended, you will start a Molecule scenario, specify your target environments, and create three custom test files.

      Begin by initializing a Molecule scenario for this role using the following command:

      • molecule init scenario -r ansible-apache

      You will see the following output:

      Output

--> Initializing new scenario default...
Initialized scenario in /home/sammy/ansible-apache/molecule/default successfully.

      You will add CentOS and Ubuntu as your target environments by including them as platforms in your Molecule configuration file. To do this, edit the molecule.yml file using a text editor:

      • nano molecule/default/molecule.yml

      Add the following highlighted content to the Molecule configuration:

      ~/ansible-apache/molecule/default/molecule.yml

      ---
      dependency:
        name: galaxy
      driver:
        name: docker
      lint:
        name: yamllint
      platforms:
        - name: centos7
          image: milcom/centos7-systemd
          privileged: true
        - name: ubuntu18
          image: solita/ubuntu-systemd
          command: /sbin/init
          privileged: true
          volumes:
            - /lib/modules:/lib/modules:ro
      provisioner:
        name: ansible
        lint:
          name: ansible-lint
      scenario:
        name: default
      verifier:
        name: testinfra
        lint:
          name: flake8
      

      Here, you're specifying two target platforms that are launched in privileged mode since you're working with systemd services:

      • centos7 is the first platform and uses the milcom/centos7-systemd image.
      • ubuntu18 is the second platform and uses the solita/ubuntu-systemd image. In addition to using privileged mode and mounting the required kernel modules, you're running /sbin/init on launch to make sure iptables is up and running.

      Save and exit the file.

For more information on running privileged containers, visit the official Molecule documentation.

Instead of using the default Molecule test file, you will create three custom test files: one for each target platform, plus one for tests that are common to all platforms. Start by deleting the scenario's default test file test_default.py with the following command:

      • rm molecule/default/tests/test_default.py

You can now move on to creating the three custom test files: test_common.py for the shared tests, and test_Debian.py and test_RedHat.py for your two target platforms.

      The first test file, test_common.py, will contain the common tests that each of the hosts will perform. Create and edit the common test file, test_common.py:

      • nano molecule/default/tests/test_common.py

      Add the following code to the file:

      ~/ansible-apache/molecule/default/tests/test_common.py

      import os
      import pytest
      
      import testinfra.utils.ansible_runner
      
      testinfra_hosts = testinfra.utils.ansible_runner.AnsibleRunner(
          os.environ['MOLECULE_INVENTORY_FILE']).get_hosts('all')
      
      
      @pytest.mark.parametrize('file, content', [
        ("/var/www/html/index.html", "Managed by Ansible")
      ])
      def test_files(host, file, content):
          file = host.file(file)
      
          assert file.exists
          assert file.contains(content)
      

In your test_common.py file, you have imported the required libraries. You have also written a test called test_files(), which covers the only task common to both distributions that your role performs: copying your template to serve as the web server's home page.
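If pytest's parametrize decorator is new to you, here is a minimal standalone sketch (unrelated to this role) showing how it expands a single test function into one test per parameter tuple:

import pytest


@pytest.mark.parametrize('word, length', [
    ('apache', 6),
    ('ufw', 3),
])
def test_word_length(word, length):
    # pytest runs this function once for each tuple listed above
    assert len(word) == length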

      The next test file, test_Debian.py, holds tests specific to Debian distributions. This test file will specifically target your Ubuntu platform.

      Create and edit the Ubuntu test file by running the following command:

      • nano molecule/default/tests/test_Debian.py

      You can now import the required libraries and define the ubuntu18 platform as the target host. Add the following code to the start of this file:

      ~/ansible-apache/molecule/default/tests/test_Debian.py

      import os
      import pytest
      
      import testinfra.utils.ansible_runner
      
      testinfra_hosts = testinfra.utils.ansible_runner.AnsibleRunner(
          os.environ['MOLECULE_INVENTORY_FILE']).get_hosts('ubuntu18')
      

Then, in the same file, you'll add the test_pkg() test.

      Add the following code to the file, which defines the test_pkg() test:

      ~/ansible-apache/molecule/default/tests/test_Debian.py

      ...
      @pytest.mark.parametrize('pkg', [
          'apache2',
          'ufw'
      ])
      def test_pkg(host, pkg):
          package = host.package(pkg)
      
          assert package.is_installed
      

      This test will check if apache2 and ufw packages are installed on the host.

      Note: When adding multiple tests to a Molecule test file, make sure there are two blank lines between each test or you'll get a syntax error from Molecule.

      To define the next test, test_svc(), add the following code under the test_pkg() test in your file:

      ~/ansible-apache/molecule/default/tests/test_Debian.py

      ...
      @pytest.mark.parametrize('svc', [
          'apache2',
          'ufw'
      ])
      def test_svc(host, svc):
          service = host.service(svc)
      
          assert service.is_running
          assert service.is_enabled
      

      test_svc() will check if the apache2 and ufw services are running and enabled.

Finally, you will add your last test, test_ufw_rules(), to the test_Debian.py file.

      Add this code under the test_svc() test in your file to define test_ufw_rules():

      ~/ansible-apache/molecule/default/tests/test_Debian.py

      ...
      @pytest.mark.parametrize('rule', [
          '-A ufw-user-input -p tcp -m tcp --dport 80 -j ACCEPT'
      ])
      def test_ufw_rules(host, rule):
          cmd = host.run('iptables -t filter -S')
      
          assert rule in cmd.stdout
      

      test_ufw_rules() will check that your firewall configuration permits traffic on the port used by the Apache service.
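As an aside (not part of this role's test suite), the object returned by testinfra's host.run() also exposes the exit code and error output; a hypothetical sketch of a stricter variant:

def test_iptables_lists_rules(host):
    cmd = host.run('iptables -t filter -S')

    assert cmd.rc == 0       # the command exited successfully
    assert cmd.stdout != ''  # at least the chain policies were printed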

      With each of these tests added, your test_Debian.py file will look like this:

      ~/ansible-apache/molecule/default/tests/test_Debian.py

      import os
      import pytest
      
      import testinfra.utils.ansible_runner
      
      testinfra_hosts = testinfra.utils.ansible_runner.AnsibleRunner(
          os.environ['MOLECULE_INVENTORY_FILE']).get_hosts('ubuntu18')
      
      
      @pytest.mark.parametrize('pkg', [
          'apache2',
          'ufw'
      ])
      def test_pkg(host, pkg):
          package = host.package(pkg)
      
          assert package.is_installed
      
      
      @pytest.mark.parametrize('svc', [
          'apache2',
          'ufw'
      ])
      def test_svc(host, svc):
          service = host.service(svc)
      
          assert service.is_running
          assert service.is_enabled
      
      
      @pytest.mark.parametrize('rule', [
          '-A ufw-user-input -p tcp -m tcp --dport 80 -j ACCEPT'
      ])
      def test_ufw_rules(host, rule):
          cmd = host.run('iptables -t filter -S')
      
          assert rule in cmd.stdout
      

      The test_Debian.py file now includes the three tests: test_pkg(), test_svc(), and test_ufw_rules().

      Save and exit test_Debian.py.

      Next you'll create the test_RedHat.py test file, which will contain tests specific to Red Hat distributions to target your CentOS platform.

      Create and edit the CentOS test file, test_RedHat.py, by running the following command:

      • nano molecule/default/tests/test_RedHat.py

      Similarly to the Ubuntu test file, you will now write three tests to include in your test_RedHat.py file. Before adding the test code, you can import the required libraries and define the centos7 platform as the target host, by adding the following code to the beginning of your file:

      ~/ansible-apache/molecule/default/tests/test_RedHat.py

      import os
      import pytest
      
      import testinfra.utils.ansible_runner
      
      testinfra_hosts = testinfra.utils.ansible_runner.AnsibleRunner(
          os.environ['MOLECULE_INVENTORY_FILE']).get_hosts('centos7')
      

      Then, add the test_pkg() test, which will check if the httpd and firewalld packages are installed on the host.

      Following the code for your library imports, add the test_pkg() test to your file. (Again, remember to include two blank lines before each new test.)

      ~/ansible-apache/molecule/default/tests/test_RedHat.py

      ...
      @pytest.mark.parametrize('pkg', [
          'httpd',
          'firewalld'
      ])
      def test_pkg(host, pkg):
          package = host.package(pkg)
      
    assert package.is_installed
      

      Now, you can add the test_svc() test to ensure that httpd and firewalld services are running and enabled.

      Add the test_svc() code to your file following the test_pkg() test:

      ~/ansible-apache/molecule/default/tests/test_RedHat.py

      ...
      @pytest.mark.parametrize('svc', [
          'httpd',
          'firewalld'
      ])
def test_svc(host, svc):
          service = host.service(svc)
      
          assert service.is_running
          assert service.is_enabled
      

The final test in the test_RedHat.py file will be test_firewalld(), which will check whether Firewalld has the http service whitelisted.

      Add the test_firewalld() test to your file after the test_svc() code:

      ~/ansible-apache/molecule/default/tests/test_RedHat.py

      ...
      @pytest.mark.parametrize('file, content', [
          ("/etc/firewalld/zones/public.xml", "<service name="http"/>")
      ])
      def test_firewalld(host, file, content):
          file = host.file(file)
      
          assert file.exists
          assert file.contains(content)
      

      After importing the libraries and adding the three tests, your test_RedHat.py file will look like this:

      ~/ansible-apache/molecule/default/tests/test_RedHat.py

      import os
      import pytest
      
      import testinfra.utils.ansible_runner
      
      testinfra_hosts = testinfra.utils.ansible_runner.AnsibleRunner(
          os.environ['MOLECULE_INVENTORY_FILE']).get_hosts('centos7')
      
      
      @pytest.mark.parametrize('pkg', [
          'httpd',
          'firewalld'
      ])
      def test_pkg(host, pkg):
          package = host.package(pkg)
      
          assert package.is_installed
      
      
      @pytest.mark.parametrize('svc', [
          'httpd',
          'firewalld'
      ])
      def test_svc(host, svc):
          service = host.service(svc)
      
          assert service.is_running
          assert service.is_enabled
      
      
      @pytest.mark.parametrize('file, content', [
          ("/etc/firewalld/zones/public.xml", "<service name="http"/>")
      ])
      def test_firewalld(host, file, content):
          file = host.file(file)
      
          assert file.exists
          assert file.contains(content)
      

      Now that you've completed writing tests in all three files, test_common.py, test_Debian.py, and test_RedHat.py, your role is ready for testing. In the next step, you will use Molecule to run these tests against your newly configured role.

      Step 4 — Testing Against Your Role

You will now execute your newly created tests against the base role ansible-apache using Molecule. To run your tests, use the following command:

• molecule test

      You'll see the following output once Molecule has finished running all the tests:

      Output

...
--> Scenario: 'default'
--> Action: 'verify'
--> Executing Testinfra tests found in /home/sammy/ansible-apache/molecule/default/tests/...
============================= test session starts ==============================
platform linux -- Python 3.6.7, pytest-4.1.1, py-1.7.0, pluggy-0.8.1
rootdir: /home/sammy/ansible-apache/molecule/default, inifile:
plugins: testinfra-1.16.0
collected 12 items

tests/test_common.py ..                                                  [ 16%]
tests/test_RedHat.py .....                                               [ 58%]
tests/test_Debian.py .....                                               [100%]

========================== 12 passed in 80.70 seconds ==========================
Verifier completed successfully.

      You'll see Verifier completed successfully in your output; this means that the verifier executed all of your tests and returned them successfully.

      Now that you've successfully completed the development of your role, you can commit your changes to Git and set up Travis CI for continuous testing.

      Step 5 — Using Git to Share Your Updated Role

      In this tutorial, so far, you have cloned a role called ansible-apache and added tests to it to make sure it works against Ubuntu and CentOS hosts. To share your updated role with the public, you must commit these changes and push them to your fork.

Run the following command to add the files and commit the changes you've made:

• git add .

      This command will add all the files that you have modified in the current directory to the staging area.

      You also need to set your name and email address in the git config in order to commit successfully. You can do so using the following commands:

      • git config user.email "sammy@digitalocean.com"
      • git config user.name "John Doe"

      Commit the changed files to your repository:

      • git commit -m "Configured Molecule"

      You'll see the following output:

      Output

[master b2d5a5c] Configured Molecule
 8 files changed, 155 insertions(+), 1 deletion(-)
 create mode 100644 molecule/default/Dockerfile.j2
 create mode 100644 molecule/default/INSTALL.rst
 create mode 100644 molecule/default/molecule.yml
 create mode 100644 molecule/default/playbook.yml
 create mode 100644 molecule/default/tests/test_Debian.py
 create mode 100644 molecule/default/tests/test_RedHat.py
 create mode 100644 molecule/default/tests/test_common.py

      This signifies that you have committed your changes successfully. Now, push these changes to your fork with the following command:

      • git push -u origin master

      You will see a prompt for your GitHub credentials. After entering these credentials, your code will be pushed to your repository and you'll see this output:

      Output

Counting objects: 13, done.
Compressing objects: 100% (12/12), done.
Writing objects: 100% (13/13), 2.32 KiB | 2.32 MiB/s, done.
Total 13 (delta 3), reused 0 (delta 0)
remote: Resolving deltas: 100% (3/3), completed with 2 local objects.
To https://github.com/username/ansible-apache.git
   009d5d6..e4e6959  master -> master
Branch 'master' set up to track remote branch 'master' from 'origin'.

      If you go to your fork's repository at github.com/username/ansible-apache, you'll see a new commit called Configured Molecule reflecting the changes you made in the files.

      Now, you can integrate Travis CI with your new repository so that any changes made to your role will automatically trigger Molecule tests. This will ensure that your role always works with Ubuntu and CentOS hosts.

      Step 6 — Integrating Travis CI

      In this step, you're going to integrate Travis CI into your workflow. Once enabled, any changes you push to your fork will trigger a Travis CI build. The purpose of this is to ensure Travis CI always runs molecule test whenever contributors make changes. If any breaking changes are made, Travis will declare the build status as such.

      Proceed to Travis CI to enable your repository. Navigate to your profile page where you can click the Activate button for GitHub.

      You can find further guidance here on activating repositories in Travis CI.

For Travis CI to work, you must create a configuration file containing instructions for it. To create the Travis configuration file, return to your server and run the following command:

• nano .travis.yml

      To duplicate the environment you've created in this tutorial, you will specify parameters in the Travis configuration file. Add the following content to your file:

      ~/ansible-apache/.travis.yml

      ---
      language: python
      python:
        - "2.7"
        - "3.6"
      services:
        - docker
      install:
        - pip install molecule docker
      script:
        - molecule --version
        - ansible --version
        - molecule test
      

      The parameters you've specified in this file are:

      • language: When you specify Python as the language, the CI environment uses separate virtualenv instances for each Python version you specify under the python key.
      • python: Here, you're specifying that Travis will use both Python 2.7 and Python 3.6 to run your tests.
      • services: You need Docker to run tests in Molecule. You're specifying that Travis should ensure Docker is present in your CI environment.
      • install: Here, you're specifying preliminary installation steps that Travis CI will carry out in your virtualenv.
  • pip install molecule docker installs Molecule (which pulls in Ansible) along with the Python library for the Docker remote API.
      • script: This is to specify the steps that Travis CI needs to carry out. In your file, you're specifying three steps:
        • molecule --version prints the Molecule version if Molecule has been successfully installed.
        • ansible --version prints the Ansible version if Ansible has been successfully installed.
        • molecule test finally runs your Molecule tests.

Specifying molecule --version and ansible --version records the installed versions in the build log, which helps you diagnose failures caused by a version mismatch or misconfiguration of molecule or ansible.

      Once you've added the content to the Travis CI configuration file, save and exit .travis.yml.

      Now, every time you push any changes to your repository, Travis CI will automatically run a build based on the above configuration file. If any of the commands in the script block fail, Travis CI will report the build status as such.

To make it easier to see the build status, you can add a badge indicating the build status to the README of your role. Open the README.md file using a text editor:

• nano README.md

      Add the following line to the README.md to display the build status:

      ~/ansible-apache/README.md

      [![Build Status](https://travis-ci.org/username/ansible-apache.svg?branch=master)](https://travis-ci.org/username/ansible-apache)
      

      Replace username with your GitHub username. Commit and push the changes to your repository as you did earlier.

      First, run the following command to add .travis.yml and README.md to the staging area:

      • git add .travis.yml README.md

      Now commit the changes to your repository by executing:

      • git commit -m "Configured Travis"

      Finally, push these changes to your fork with the following command:

      • git push -u origin master

      If you navigate over to your GitHub repository, you will see that it initially reports build: unknown.

[Figure: README badge showing build: unknown]

      Within a few minutes, Travis will initiate a build that you can monitor at the Travis CI website. Once the build is a success, GitHub will report the status as such on your repository as well — using the badge you've placed in your README file:

[Figure: README badge showing build: passing]

      You can access the complete details of the builds by going to the Travis CI website:

[Figure: Build details on the Travis CI website]

      Now that you've successfully set up Travis CI for your new role, you can continuously test and integrate changes to your Ansible roles.

      Conclusion

      In this tutorial, you forked a role that installs and configures an Apache web server from GitHub and added integrations for Molecule by writing tests and configuring these tests to work on Docker containers running Ubuntu and CentOS. By pushing your newly created role to GitHub, you have allowed other users to access your role. When there are changes to your role by contributors, Travis CI will automatically run Molecule to test your role.

      Once you're comfortable with the creation of roles and testing them with Molecule, you can integrate this with Ansible Galaxy so that roles are automatically pushed once the build is successful.


