Simplifying Playwright Testing with Docker


As web applications adopt modern architectures, the complexity with which they are built grows. Ensuring that such an application works seamlessly requires thorough testing, which can be tedious. Playwright is a popular and powerful tool with a rich feature set for end-to-end testing of such web applications. However, configuring the environment to execute Playwright tests can be complex because of the multiple dependencies involved.

This is where Docker can help ease the pain of setting up the system and streamline the process. Using Docker, the exact testing environment can be defined within a container image. That image can be replicated across multiple environments while maintaining consistency, making it easy to set up a managed test environment for executing Playwright tests.

In this article, we will explore how Docker can help us scale our Playwright tests with smoother execution.

What is Docker?

Docker is a platform for running applications in isolated containers built from images. You can think of a container as a simple box: the box holds whatever you define, for example files, libraries, and environment variables. Docker takes this box along with its contents and runs it as a lightweight, self-contained, and portable software package.

With Docker, you can define standardized application environments that hide the underlying infrastructure complexity, which makes software easier to scale and ship.

Imagine being able to package all the dependencies and configuration of an application into a single unit that runs the same code in exactly the same way anywhere. You need not worry about the machine type, its setup, or its environment configuration. This is what Docker containers accomplish.
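As a quick illustration, Docker ships a tiny hello-world image that can be pulled and run with a single command on any machine where Docker is installed, and it behaves the same everywhere:

docker run hello-world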

What value does Docker deliver?

Docker helps solve some real-world problems in the software development cycle. Let us discuss them and see how Docker comes to the rescue:

  • Managing Dependency Conflicts

Applications rely on specific versions of languages, frameworks, configurations, and so on, and these may conflict with the versions installed on the host system. This situation is often called ‘dependency hell’. Docker provides isolation and hence helps avoid these issues, as illustrated in the short example after this list.

  • Configuration Complexity

To configure the runtime environment for an application, multiple steps are required to set up the different components such as databases, web servers, and so on. Docker lets you describe the exact components and configuration in a simple text-based Dockerfile. This serves as the build recipe for an immutable container image that always launches with the same preset configuration.

  • Environment Inconsistency

There can be a huge difference between local workstations and the production infrastructure. This difference can lead to a mismatch between code executed locally and in production. Docker ensures that environments are replicas of each other by capturing the OS files, dependencies, networks, and configuration that wrap the application in a portable, standardized capsule.

  • Scaling Difficulties

Scaling applications can be a costly and resource-intensive affair if traditional virtual machines or hardware are used. Docker, being lightweight, allows you to instantly spin up multiple replicas of the same application without variability in the environment. This makes scaling the application much simpler and more economical.

Docker lets the developers define the required dependencies, system configurations, and environments for an application, addressing the core challenges of managing external state and variability.
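As a small illustration of this isolation, two containers based on different official Node.js images can run side by side on the same host, each bringing its own runtime version, without touching whatever Node.js installation (if any) exists locally:

# Each container ships its own Node.js runtime, isolated from the host
docker run --rm node:16 node --version
docker run --rm node:20 node --version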

Understanding and Installing Docker

We will first learn about the components of Docker and its system structure, and then we will see how we can install Docker to start using it.

Docker is built using a client-server model and has three main components that allow for building, distributing, and running containerized applications.

Docker Engine

The Docker Engine is the part of Docker that developers interact with directly. It provides the Docker command-line interface (CLI) and API that power the full container development life cycle, which includes:

  • Building Docker images using the Dockerfiles
  • Management of local and remote images
  • Pushing images to registries
  • Streamlining of deployment
  • Configuration of container volume and network
  • Instantiation of containers from images

Using the engine, applications can be containerized, published, and run on any platform that supports Docker.
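These responsibilities map to everyday CLI commands. As a rough sketch (the image, registry, volume, and network names below are placeholders):

docker build -t my-app .            # build an image from the Dockerfile in the current directory
docker images                       # list images available locally
docker push my-registry/my-app      # push the image to a registry
docker run -d -p 8080:80 my-app     # start a container from the image
docker volume create my-data        # create a volume for persistent data
docker network create my-net        # create a network that containers can share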

The Docker Daemon

The daemon runs continuously as a server in the background. It handles the heavy orchestration work while receiving runtime commands from developers through the API or the Docker command-line interface (CLI). Its main responsibilities are:

  • Pulling container images from registry repositories.
  • Storing the fetched images on the host system’s disk so they are instantly available.
  • Transforming stored images into running containers according to the configuration specifications.
  • Attaching file system mounts and network connections to the containers.
  • Automating the workflows that deploy and maintain containers.
  • Tracking resource usage and running health checks on the containers.
  • Streaming logs and system events back to the client for visibility.

The daemon works continuously to automatically coordinate the complete container lifecycle in the background so that the developer can focus on building the applications rather than working on infrastructure and operations.
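Although the daemon works behind the scenes, its activity can be observed from the client. The following standard commands ask the daemon for running containers, resource usage, logs, and system events (the container name is a placeholder):

docker ps                    # list containers currently managed by the daemon
docker stats --no-stream     # one-off snapshot of container resource usage
docker logs my-container     # stream a container's logs back to the client
docker events                # stream daemon-side system events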

Docker Registries

Registries act as central hubs for storing, referencing, and fetching Docker images. Images package the entire application along with its environment into portable artifacts that can be easily transferred between registries and host systems. Registries come with several benefits:

  • Standardized components in the form of images between the projects.
  • Distribution of images within the pipeline to facilitate continuous integration.
  • Centralized repositories to manage artifact versions and dependencies.
  • Controlled image access to internal registries allowing for secure collaboration.

With this robust client-server architecture, Docker binds the application environments into images that can be easily ported, without any additional need for dependency management, thereby ensuring accurate delivery across different infrastructures.
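A typical registry round trip looks like the sketch below; the registry host, repository, and tag are placeholder values:

docker tag my-app registry.example.com/team/my-app:1.0    # give the local image a registry-qualified name
docker push registry.example.com/team/my-app:1.0          # upload the image to the registry
docker pull registry.example.com/team/my-app:1.0          # fetch the same image on any other machine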

Now, we will see how we can install Docker on our system.

Installing Docker

Docker can be installed on your device, whether it is a Windows system or a Mac. Before installing Docker Desktop on Windows, ensure that your system meets the requirements, then follow the steps in the official documentation to install Docker Desktop on Windows.

Similarly, to install Docker on your Mac, follow the instructions in the official documentation to install Docker Desktop on Mac. Since I am using a Mac for this demonstration, I downloaded the .dmg file for my system. Once it is downloaded, double-click the file and follow the installer’s steps. After you complete them, you will see the Docker Desktop window.


With the required installation done, let us now work on executing our Playwright test using Docker.

Writing and Executing the Playwright Tests

To begin writing the tests, you need to install Playwright first. If you do not already have it, you may refer to our article on Playwright Introduction to start your journey. Once it is installed, the next step is to create a project folder on your local system where you will write your tests. I have created a folder named PlaywrightDocker on my system for the demo. We have already installed Docker Desktop manually; to confirm that it is ready to use, execute the below command in the Terminal and check the result:

docker --version

You will see the version installed on your system displayed in the output.

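If Playwright is not yet set up inside the project folder, it can typically be added with commands along these lines (a sketch of the standard Playwright setup; refer to the Playwright Introduction article for the exact steps):

npm init -y                        # create a package.json for the project
npm install -D @playwright/test    # add the Playwright test runner as a dev dependency
npx playwright install             # download the browser binaries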

Let us quickly create a test file under the tests folder in the project, using JavaScript, and write a simple test that launches the browser and navigates to google.com. We will then fetch the title of the page and assert that it matches “Google”.

The below code implements this test case:

const { test, expect } = require('@playwright/test');

test('basic test', async ({ page }) => {
  // Launches the Google homepage
  await page.goto('https://www.google.com');
  // Fetches the title of the page and stores it in a constant
  const title = await page.title();
  // Asserts that the title is Google
  expect(title).toBe('Google');
});
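As a small variation, the same check can be written with Playwright’s web-first assertion, which keeps retrying until the title matches or the timeout expires (this snippet reuses the imports from the file above):

test('basic test with web-first assertion', async ({ page }) => {
  await page.goto('https://www.google.com');
  // Retries the title check instead of reading the title once
  await expect(page).toHaveTitle('Google');
});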

Let us run the tests and check the results using the below command:

npx playwright test

By default, the tests are executed in headless mode. If you want to see the browser open, you can run them in headed mode using the command below:

npx playwright test --headed

Upon execution you will notice that your test passes and the same is visible in the console logs.


Using Docker to Execute Playwright Tests

Now that we have seen that our test executes fine, the first step towards running it with Docker is to create a Dockerfile. We will write the below code in our Dockerfile:

# Use the official Playwright image, which ships with Node.js and the browsers preinstalled
FROM mcr.microsoft.com/playwright:v1.40.1-focal

# Set working directory
WORKDIR /app

# Copy test code and the package manifest
COPY tests /app/tests
COPY package.json /app/

# Install dependencies
RUN npm cache clean --force
RUN npm install -g playwright
RUN npm install

# (Optional) install Node.js 16 from NodeSource; the base image already includes Node.js,
# so this step is usually not needed
RUN apt-get update && apt-get install -y curl wget gnupg ca-certificates && \
    curl -sL https://deb.nodesource.com/setup_16.x | bash - && \
    apt-get install -y nodejs
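Note that running npm run test inside the container, as we will do later, assumes that package.json defines a test script. A minimal example could look like this (the package name and dependency version are illustrative):

{
  "name": "playwrightdocker",
  "scripts": {
    "test": "playwright test"
  },
  "devDependencies": {
    "@playwright/test": "^1.40.1"
  }
}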

Once the Dockerfile is saved, we will build the Docker image using the below command:

docker build -t playwright-test .

In the above command, docker build is the base command to build a Docker image, -t playwright-test tags the image with the name “playwright-test”, and the dot (.) at the end specifies the current directory as the build context.
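Because the build context is the current directory, it can also help to add a .dockerignore file so that local artifacts such as node_modules and previous test output are not sent to the Docker daemon. A minimal example:

node_modules
test-results
playwright-report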

Upon executing the docker build command, you will see the build output in the terminal.


Now, if you go back to the Docker Desktop application you installed earlier and open the Images tab, you will see the Docker image you just created.


Finally, we have the Docker image ready, and we will now execute the test in a container using the image we just created. To do so, execute the below command:

docker run -it --rm playwright-test

In the above command, docker run is the base command to launch a container, -it enables an interactive terminal with a pseudo-TTY, --rm removes the container after the execution ends, and playwright-test is the name of the image to be used for the container. Once you execute it, you will see that the container starts.
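If you prefer not to attach an interactive terminal, the test command can also be passed directly to docker run so that the container runs the tests and exits on its own:

docker run --rm playwright-test npx playwright test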


Continuing with the interactive container we started above, type the below command and press Enter:

npm run test

As you can see from the console output, our test passed successfully when executed through the Docker container. You can see similar results in Docker Desktop under the Logs section.


And this is how you can execute your tests in isolated containers using Docker and scale your execution without giving much thought to infrastructure and operations requirements.
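To take that scaling further, several containers can be started from the same image, each running a slice of the test suite using Playwright’s built-in sharding. A rough sketch:

# Each command can run on a separate machine or in a separate CI job
docker run --rm playwright-test npx playwright test --shard=1/3
docker run --rm playwright-test npx playwright test --shard=2/3
docker run --rm playwright-test npx playwright test --shard=3/3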

Conclusion

In this article, we talked about the basic concepts of Docker: it enables you to package all the contents of your application, be it the code, environment settings, configurations, or dependencies, into one unit that can be used on any platform. Docker eases the pain of managing environment settings and operational dependencies by addressing challenges like environment inconsistency, configuration complexity, and dependency conflicts. Docker uses a client-server architecture with the Docker Engine, the Docker daemon, and Docker registries as its components. These components help build and run containers, as well as manage and host them. Executing tests with Docker can be very quick when you want to run tests in isolated containers or even scale the execution through parallel testing.

Combining the comprehensive browser testing capabilities of Playwright with Docker’s lightweight containerization yields a robust framework for testing modern web applications. Teams can use this integration to enhance quality and confidence across the end-to-end development cycle.