

      Painless and Efficient Ways to Do Usability Testing


      This Tech Talk will be streaming live on Tuesday, July 28, 2020, 11:30 a.m.–12:30 p.m. ET.
      RSVP for free on GoToWebinar here.

      About the Talk

      Usability testing helps product builders and designers make informed design decisions that increase the odds of creating helpful and delightful user experiences.

      Rafael Mojica, VP of User Experience at DigitalOcean, will guide you through his favorite ways to carry out usability testing, and share insights on how you can validate the decisions you make while building.

      What You’ll Learn

      • What, when, and with whom to validate your product/service design decisions
      • Tips on what to do before, during, and after a usability study
• Best practices for successful usability testing

      This Talk is Designed For

      Product builders and designers.

      About the Presenter

      Rafael Mojica, VP of User Experience at DigitalOcean, is a multidisciplinary interaction designer. He’s passionate about expanding the benefits of design thinking and communications into all aspects of a business.

      How to Join

This Tech Talk is free and open to everyone. Join the live event on Tuesday, July 28, 2020, 11:30 a.m.–12:30 p.m. ET by registering on GoToWebinar here. Rafael will be answering questions at the end.

If you can’t make the live event, the recording and transcript will be published here as soon as they’re available.




      How To Add Unit Testing to Your Django Project


      The author selected the Open Internet/Free Speech Fund to receive a donation as part of the Write for DOnations program.

      Introduction

It is nearly impossible to build websites that work perfectly the first time. For that reason, you need to test your web application to find errors and fix them proactively. To improve the efficiency of tests, it is common to break testing down into units that exercise specific functionalities of the web application. This practice is called unit testing. It makes errors easier to detect because the tests focus on small parts (units) of your project independently of other parts.

Testing a website can be a complex task because it is made up of several layers of logic, such as handling HTTP requests, validating forms, and rendering templates. However, Django provides a set of tools that makes testing your web application seamless. In Django, the preferred way to write tests is to use the Python unittest module, although it is possible to use other testing frameworks.

      In this tutorial, you will set up a test suite in your Django project and write unit tests for the models and views in your application. You will run these tests, analyze their results, and learn how to find the causes of failing tests.

      Prerequisites

      Before beginning this tutorial, you’ll need the following:

      Step 1 — Adding a Test Suite to Your Django Application

      A test suite in Django is a collection of all the test cases in all the apps in your project. To make it possible for the Django testing utility to discover the test cases you have, you write the test cases in scripts whose names begin with test. In this step, you’ll create the directory structure and files for your test suite, and create an empty test case in it.

      If you followed the Django Development tutorial series, you’ll have a Django app called blogsite.

      Let’s create a folder to hold all our testing scripts. First, activate the virtual environment:

      • cd ~/my_blog_app
      • . env/bin/activate

Then navigate to the blogsite app directory (the folder that contains the models.py and views.py files) and create a new folder called tests:

      • cd ~/my_blog_app/blog/blogsite
      • mkdir tests

      Next, you’ll turn this folder into a Python package, so add an __init__.py file:

      • cd ~/my_blog_app/blog/blogsite/tests
      • touch __init__.py

      You’ll now add a file for testing your models and another for testing your views:

      • touch test_models.py
      • touch test_views.py
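At this point, you can verify the layout by listing the contents of the new folder (a quick check; your output should show the three files you just created):

• ls ~/my_blog_app/blog/blogsite/tests

Output

__init__.py  test_models.py  test_views.py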

Finally, you will create an empty test case in test_models.py. You will need to import the Django TestCase class and make it a superclass of your own test case class. Later on, you will add methods to this test case to test the logic in your models. Open the file test_models.py:

• nano test_models.py

Now add the following code to the file:

      ~/my_blog_app/blog/blogsite/tests/test_models.py

      from django.test import TestCase
      
      class ModelsTestCase(TestCase):
          pass
      

      You’ve now successfully added a test suite to the blogsite app. Next, you will fill out the details of the empty model test case you created here.

      Step 2 — Testing Your Python Code

In this step, you will test the logic of the code written in the models.py file. In particular, you will be testing the save method of the Post model to ensure it creates the correct slug from a post’s title when called.

      Let’s begin by looking at the code you already have in your models.py file for the save method of the Post model:

      • cd ~/my_blog_app/blog/blogsite
      • nano models.py

      You’ll see the following:

      ~/my_blog_app/blog/blogsite/models.py

      class Post(models.Model):
          ...
          def save(self, *args, **kwargs):
              if not self.slug:
                  self.slug = slugify(self.title)
              super(Post, self).save(*args, **kwargs)
          ...
      

      We can see that it checks whether the post about to be saved has a slug value, and if not, calls slugify to create a slug value for it. This is the type of logic you might want to test to ensure that slugs are actually created when saving a post.

      Close the file.
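If you are curious what slugify produces before writing the test, you can try it in the Django shell. This is a quick sketch; it assumes you run the shell from the folder containing manage.py:

• cd ~/my_blog_app/blog
• python manage.py shell

>>> from django.template.defaultfilters import slugify
>>> slugify("My first post")
'my-first-post'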

To test this, go back to test_models.py:

• nano ~/my_blog_app/blog/blogsite/tests/test_models.py

Then update it to the following, adding the new imports and the test method:

      ~/my_blog_app/blog/blogsite/tests/test_models.py

      from django.test import TestCase
      from django.template.defaultfilters import slugify
      from blogsite.models import Post
      
      
      class ModelsTestCase(TestCase):
          def test_post_has_slug(self):
              """Posts are given slugs correctly when saving"""
              post = Post.objects.create(title="My first post")
      
              post.author = "John Doe"
              post.save()
              self.assertEqual(post.slug, slugify(post.title))
      

This new method, test_post_has_slug, creates a new post with the title "My first post", then gives the post an author and saves it. After this, using the assertEqual method from the Python unittest module, it checks whether the slug for the post is correct. The assertEqual method checks whether the two arguments passed to it are equal as determined by the == operator and raises an error if they are not.

      Save and exit test_models.py.

      This is an example of what can be tested. The more logic you add to your project, the more there is to test. If you add more logic to the save method or create new methods for the Post model, you would want to add more tests here. You can add them to the test_post_has_slug method or create new test methods, but their names must begin with test.
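For instance, if you also wanted to check that an existing slug is preserved when a post is saved again, a hypothetical additional method could look like the following. This is only a sketch based on the save method shown earlier, not part of the original tutorial:

class ModelsTestCase(TestCase):
    ...
    def test_slug_is_not_overwritten(self):
        """An existing slug is kept when a post is saved again"""
        post = Post.objects.create(title="My first post")
        original_slug = post.slug

        # save() only generates a slug when one is missing, so changing
        # the title and saving again should leave the slug untouched.
        post.title = "A brand new title"
        post.save()

        self.assertEqual(post.slug, original_slug)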

      You have successfully created a test case for the Post model where you asserted that slugs are correctly created after saving. In the next step, you will write a test case to test views.

      Step 3 — Using Django’s Test Client

In this step, you will write a test case that tests a view using the Django test client. The test client is a Python class that acts as a dummy web browser, allowing you to test your views and interact with your Django application the same way a user would. You can access the test client by referring to self.client in your test methods. Let’s create a test case in test_views.py. First, open the file:

• nano ~/my_blog_app/blog/blogsite/tests/test_views.py

Then add the following:

      ~/my_blog_app/blog/blogsite/tests/test_views.py

      from django.test import TestCase
      
      
      class ViewsTestCase(TestCase):
          def test_index_loads_properly(self):
              """The index page loads properly"""
              response = self.client.get('your_server_ip:8000')
              self.assertEqual(response.status_code, 200)
      

The ViewsTestCase contains a test_index_loads_properly method that uses the Django test client to visit the index page of the website (http://your_server_ip:8000, where your_server_ip is the IP address of the server you are using). Then the test method checks whether the response has a status code of 200, which means the page responded without any errors. As a result, you can be confident that when a user visits the page, it will respond without errors too.

Apart from the status code, you can test other properties of the test client response; see the Testing Responses page in the Django documentation for details.
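For example, Django’s TestCase also provides helpers such as assertContains and assertTemplateUsed. The sketch below assumes your index view renders a template named blogsite/index.html, which may differ in your project:

class ViewsTestCase(TestCase):
    ...
    def test_index_uses_expected_template(self):
        """The index page renders the expected template"""
        response = self.client.get('your_server_ip:8000')
        # The template name here is an assumption; replace it with the
        # template your index view actually renders.
        self.assertTemplateUsed(response, 'blogsite/index.html')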

      In this step, you created a test case for testing that the view rendering the index page works without errors. There are now two test cases in your test suite. In the next step you will run them to see their results.

      Step 4 — Running Your Tests

Now that you have finished building a suite of tests for the project, it is time to execute these tests and see their results. To run the tests, navigate to the blog folder (the one containing the application’s manage.py file):

• cd ~/my_blog_app/blog

Then run them with:

• python manage.py test

You’ll see output similar to the following in your terminal:

      Output

Creating test database for alias 'default'...
System check identified no issues (0 silenced).
..
----------------------------------------------------------------------
Ran 2 tests in 0.007s

OK
Destroying test database for alias 'default'...

In this output, there are two dots .., each of which represents a passed test case. Now you’ll modify test_views.py to trigger a failing test. First open the file with:

• nano ~/my_blog_app/blog/blogsite/tests/test_views.py

Then change the assertion’s expected status code so the file reads as follows:

      ~/my_blog_app/blog/blogsite/tests/test_views.py

      from django.test import TestCase
      
      
      class ViewsTestCase(TestCase):
          def test_index_loads_properly(self):
              """The index page loads properly"""
              response = self.client.get('your_server_ip:8000')
              self.assertEqual(response.status_code, 404)
      

Here you have changed the status code from 200 to 404. Now run the tests again from the directory containing manage.py:

• python manage.py test

You’ll see the following output:

      Output

Creating test database for alias 'default'...
System check identified no issues (0 silenced).
.F
======================================================================
FAIL: test_index_loads_properly (blogsite.tests.test_views.ViewsTestCase)
The index page loads properly
----------------------------------------------------------------------
Traceback (most recent call last):
  File "~/my_blog_app/blog/blogsite/tests/test_views.py", line 8, in test_index_loads_properly
    self.assertEqual(response.status_code, 404)
AssertionError: 200 != 404

----------------------------------------------------------------------
Ran 2 tests in 0.007s

FAILED (failures=1)
Destroying test database for alias 'default'...

You see that there is a descriptive failure message that tells you the script, test case, and method that failed. It also tells you the cause of the failure, the status code not being equal to 404 in this case, with the message AssertionError: 200 != 404. The AssertionError here is raised at the assertEqual line in the test_views.py file:

      ~/my_blog_app/blog/blogsite/tests/test_views.py

      from django.test import TestCase
      
      
      class ViewsTestCase(TestCase):
          def test_index_loads_properly(self):
              """The index page loads properly"""
              response = self.client.get('your_server_ip:8000')
              self.assertEqual(response.status_code, 404)
      

      It tells you that the assertion is false, that is, the response status code (200) is not what was expected (404). Preceding the failure message, you can see that the two dots .. have now changed to .F, which tells you that the first test case passed while the second didn’t.
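When a test fails, it is often convenient to rerun only that test instead of the whole suite. Django’s test runner accepts a dotted path to a specific test module, class, or method, for example:

• python manage.py test blogsite.tests.test_views.ViewsTestCase.test_index_loads_properly

This runs just the failing test, which shortens the feedback loop while you debug.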

      Conclusion

      In this tutorial, you created a test suite in your Django project, added test cases to test model and view logic, learned how to run tests, and analyzed the test output. As a next step, you can create new test scripts for Python code not in models.py and views.py.

      Following are some articles that may prove helpful when building and testing websites with Django:

      You can also check out our Django topic page for further tutorials and projects.




      Business Continuity and Disaster Recovery Basics: Testing 101


      “Luck is what happens when preparation meets opportunity.” – Seneca

      As I covered in another blog post, the first step to any effective business continuity and disaster recovery program is crafting a thoughtful, achievable plan.

      But having a great business continuity and disaster recovery plan on paper doesn’t mean that the work is done. After all, how do you evaluate the efficacy of your plan or make adjustments before you actually need it? The answer: by putting it to the test.

      Disaster Recovery Plan Testing

      I am fond of saying that managed services are a three-legged stool made up of technology, people and processes. If you lose any one leg, the stool falls over. And since an IT department is essentially offering managed services to the wider organization, IT management should think in terms of the same triad.

      Let’s break it down:

      • Technology: the tool or set of tools to be used
      • People: trained, knowledgeable staff to operate the technology
      • Processes: the written instructions for the people to follow when operating the technology. (See another blog I wrote for more information: “6 Processes You Need to Mature Your Managed Services.”)

      For a disaster recovery scenario, you need to test the stool to make sure that each leg is ready and that the people know what to do when the time comes. One useful tool for this is a tabletop exercise (TTX). The purpose of the TTX is to simply get people thinking about what technology they touch and what processes are already in place to support their tasks.

      Tabletop Exercise Steps

      Let’s walk through the stages of a typical TTX.

      No. 1: Develop a Narrative

      Write a quick narrative for the disaster. Start off assuming all your staff are available, and then work through threats that you may have already identified. Some examples:

      • Over the weekend, a train derailed, spilling hazardous materials. The fire department has evacuated an area that includes your headquarters, which contains important servers.
      • Just 10 minutes ago, your firm’s servers were all struck by a ransomware attack.
      • Heavy rains have occurred, and the server room in the basement is starting to flood.

      Now, some questions and prompts for your staff:

      • What should we do?
      • How do we communicate during this?
      • How do we continue to support the business?
      • What are you doing? Show me! (Pointing isn’t usually polite, but this might be a time to do so.)
      • How do we communicate the event to clients, customers, users, etc.?

      Going through the exercise, you’ll likely find that certain recovery processes are not properly documented or even completely missing. For example, your network administrator might not have a written recovery process. Have them and any other relevant staff produce and formalize the process, ready to be shared at the next TTX.

      Continue this way for all the role-players until your team can successfully work through the scenario.  You will want to thoroughly test people’s roles, whether in networking, operating systems, applications, end user access or any other area.

      No. 2: Insert Some Realism

      Unfortunately, we have all seen emergency situations and scenarios, such as the 9/11 terrorist attacks, where key personnel are either missing, incapacitated or even deceased. In less unhappy scenarios, some staff might not be able to tend to work since their home or family was affected by the disaster. For the purposes of a TTX, you can simply designate someone as being on vacation and unreachable, then have them sit out.

      Ask:

      • Who picks up their duties?
      • Does the replacement know where to find the documentation?
      • Can the replacement read and understand the written documentation?

      No. 3: “DIVE, DIVE, DIVE!”—Always Be Prepared

      Just like a submarine commander might call a crash dive drill at the most inopportune time, call a TTX drill on your own team to test the plan. For this, someone might actually be on vacation. Use that to your advantage to make sure that the whole team knows how to step in and how to communicate throughout the drill. You might even plan the drill to coincide with a key player’s vacation for added realism.

      No. 4: Break Away From the Table

Once you’ve executed your tabletop exercise, it’s time to do a real test! Have your team actually work through all of the steps of the process to fail over to the recovery site.

Again, you will want to test that the servers and applications can all be turned up in the recovery environment. To prevent data islands, make certain that users can successfully access your applications at the recovery site from wherever they would operate during a disaster. Here are some questions for user access testing:

      • Can users reach the replica site over the internet/VPN?
      • Can users use remote desktop protocol (RDP) to connect to servers in the replica environment?
      • If users in an office were displaced, could they reach the replica site from home using an SSL VPN?

      No. 5: Bring in a Trusted Service Partner

The help that an IT service provider offers doesn’t have to stop with managing your Disaster Recovery as a Service infrastructure or environment. With every INAP DRaaS solution, you get white glove onboarding and periodic testing to make sure that your plans are as robust as you need them to be. Between scheduled tests, you can also test your failover at will, taking your staff beyond tabletop exercises to evaluate their ability to recover the environment on their own. Staying prepared to handle a disaster is a continuous process, and we can be there every step of the way to guide you through it.

      Explore INAP Disaster Recovery as a Service.


      Paul Painter
      • Director, Solution Architecture


Paul Painter is Director, Solution Architecture. He manages the central U.S. region, with his team supporting sales by providing quality presales engineering and optimizing customer onboarding processes.


