
To support the fast-paced software industry, DevOps and Agile practices have established themselves as a helping hand. With ever-larger digital transformation projects, organizations realize that the only way to shorten product development cycles is to automate the processes involved in the delivery pipeline.

Many tools and technologies have emerged to support DevOps and Agile, but the standout enabler is containerization, which brings flexibility to application packaging and automates a number of processes in the delivery pipeline. With containerization, independent DevOps teams can focus on their core tasks: the Operations team can create containers with all the needed configurations and dependencies, while the Dev team can focus on writing quality code that can be easily deployed. But we will not be able to achieve the end goal of Continuous Integration/Delivery until test automation is included in this process.

Docker is an open source solution that makes it easier to create, deploy and run applications securely using containers. Containers allow development teams to deliver an application as a single package with all the moving parts it needs, such as libraries and dependencies.

Gone are the days when applications had to be deployed on separate machines to test across multiple environments (Dev, QA, Staging, UAT). When an application had to be tested in each environment across various configurations, such as different browsers, application builds or operating systems, you can imagine the magnitude of effort required. With Docker we can package the application and its dependencies so that whenever developers release their work, it can run in all environments (Dev, QA and Prod) without needing separate machines for different browsers, application builds or operating systems.

Even with a mechanism to deploy application code to multiple environments in a single go, testing all of those environments manually would be tedious. To overcome this challenge, introducing test automation within a Docker environment could be the best bet. By engaging automation tools in the development cycle, we can reduce effort and time while achieving more test coverage. In addition, cost-effective testing with an open source automation tool like Selenium can further improve the ROI.

Now, one could ask why we should use test automation within Docker when automation already delivers ROI without it. The answer lies in the additional benefits Docker brings. First, you can test an application on multiple browsers and browser versions without installing them on multiple machines. Second, you can test multiple application builds concurrently without changing the code for different environments. On top of that, you can test your application on many device/OS/browser combinations on a single machine.

Selenium within Docker makes it much quicker to test different environments by using pre-configured containers. You can spin up a new Selenium container when you need one, discard it when it is no longer required, and then create another. Team members can also use the same container regardless of their operating system, so there is a lower chance of compatibility issues between environments.
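As a concrete sketch of that spin-up/discard cycle, the commands below start a disposable standalone Selenium container with Chrome baked in and remove it afterwards (the selenium/standalone-chrome image is published on Docker Hub; the container name is our own choice):

```shell
# Start a disposable Selenium container with Chrome pre-installed
docker run -d --name selenium-chrome -p 4444:4444 --shm-size=2g selenium/standalone-chrome

# ... run the test suite against http://localhost:4444 ...

# Discard the container once it is no longer required
docker rm -f selenium-chrome
```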

The process of setting up a test automation solution within Docker is straightforward. First we create two files: a Dockerfile and a Docker Compose file. Then we need images of the desired tools and software from Docker Hub, the world's largest library of container images. On Docker Hub we can find images of browsers, tools, utilities and more, which can be pulled whenever required. For configuring Selenium within Docker we will need images such as selenium/hub and selenium/node-chrome.
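For example, the Selenium Grid images can be pulled from Docker Hub like this (image names as published by the Selenium project, with one node image per browser):

```shell
docker pull selenium/hub           # the grid hub that routes test sessions
docker pull selenium/node-chrome   # a worker node with Chrome installed
docker pull selenium/node-firefox  # a worker node with Firefox installed
```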

After this, we create our own images using Dockerfiles, which contain the command-line instructions needed to assemble an image with the desired configuration (application builds, environments, automation scripts, etc.).
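A Dockerfile for such an image might look like the sketch below; the requirements.txt and tests/ paths are hypothetical stand-ins for your own automation scripts and their dependency list:

```dockerfile
# Hypothetical image that packages the automation scripts and their dependencies
FROM python:3.11-slim
WORKDIR /tests
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt  # e.g. selenium, pytest
COPY tests/ .
CMD ["pytest", "--tb=short"]  # run the suite when the container starts
```

The image is then built with `docker build -t my-automation .` (the tag name is illustrative).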

The next step is to create a Docker Compose file describing how the different images used in the automation connect to each other. Docker Compose is a tool for defining and running multi-container applications with a single command. We can also use Docker Swarm or Kubernetes (K8s) to orchestrate Docker containers.
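A minimal Compose file wiring a hub to one Chrome node could look like this sketch (the SE_EVENT_BUS_* variables follow the Selenium 4 grid images; older image tags use different variables):

```yaml
version: "3"
services:
  selenium-hub:
    image: selenium/hub
    ports:
      - "4444:4444"        # tests connect to the grid on this port
  chrome:
    image: selenium/node-chrome
    shm_size: 2g           # Chrome needs extra shared memory
    depends_on:
      - selenium-hub
    environment:
      - SE_EVENT_BUS_HOST=selenium-hub
      - SE_EVENT_BUS_PUBLISH_PORT=4442
      - SE_EVENT_BUS_SUBSCRIBE_PORT=4443
```

Running `docker compose up` (or `docker-compose up` on older installations) brings the whole grid up with one command.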

The final step in achieving test automation is to run the Docker Compose file. Once executed, it runs the automation scripts on the desired environment/configuration. The test execution results can then be published to CI and ALM tools such as Jenkins, VSO, Bamboo, Jira, ALM or TFS.
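Once the grid is up, the automation scripts reach it through Selenium's Remote WebDriver. A minimal Python sketch, assuming a hub listening on localhost:4444 and the selenium package installed; the target page and the helper names are illustrative:

```python
def grid_url(host: str = "localhost", port: int = 4444) -> str:
    """Build the endpoint of the Selenium Grid hub the tests will target."""
    return f"http://{host}:{port}"


def run_smoke_test() -> str:
    """Open a page through the grid and return its title.

    selenium is imported lazily so grid_url() stays usable even
    where the selenium package is not installed.
    """
    from selenium import webdriver

    options = webdriver.ChromeOptions()
    driver = webdriver.Remote(command_executor=grid_url(), options=options)
    try:
        driver.get("https://example.com")  # illustrative target page
        return driver.title
    finally:
        driver.quit()

# Call run_smoke_test() once the grid from the Compose file is up.
```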

Use of containers in software development is becoming a widely used approach and using them in the testing process is no exception. Test automation with Docker definitely makes sense with DevOps and Agile methodologies since it makes testing more efficient and dynamic by reducing test dependencies and giving flexibility to test applications with diverse configurations, without creating multiple test environments.

About the author

Vikas Shukla

An agile and customer focused professional, Vikas has been Pyramid Consulting's Director of Quality Assurance since 2009. Vikas manages a team of 90+ QA professionals and leads our QA Center of Excellence.
