Nuvole: Implementing fast end-to-end tests on Drupal with Docker and JSON-RPC

A behind-the-scenes look at how we tackled the challenges of end-to-end testing in a Drupal and Next.js project, using Docker, JSON-RPC, and GitHub Actions to streamline the process.

"We'll add the tests later". Most of us have said this at some point in our software development career. But then "later" never arrives.

That is because implementing tests is hard. Setting up the test runners, writing the test cases, keeping them running: all of it requires extra effort. End-to-end (E2E) tests demand even more effort, since they touch every part of an application. However, this effort always pays off, and in this post I will share how we implemented E2E tests on one of our Drupal projects so that you can save some of the effort needed to get started. We are very happy with how this approach turned out and plan to use it on future projects too.

Meet ILO Live

ILO Live is the International Labour Organization (ILO)'s online video streaming platform. It hosts both live streams and recordings of events organized all across the world by the ILO.

Under the hood, ILO Live is powered by Next.js and a headless Drupal CMS. Most pages on the site are statically served by Next.js.

In this post, I will take you through how we implemented E2E tests on ILO Live.

The challenges of implementing E2E tests

A common approach when writing frontend tests is to mock API responses. But with ILO Live, we wanted to go a step further and use the actual CMS in the tests. That way, in addition to verifying that the site works as intended, we can also verify that changes to the CMS have not broken any functionality on the site. But to do this, we had to solve two major problems.

  1. How do we orchestrate the CMS into different testing states from the test runner?
  2. How do we quickly start the CMS during each test run?

Orchestrating the Drupal CMS using a private JSON-RPC API

The simplest way to orchestrate a Drupal CMS is to have the test runner click on various elements to create new entities and change their state. But this takes a lot of time, especially when multiple entities need to be created during each test run.

To solve this problem, we decided to create a private API using the JSON-RPC module. This API would only be enabled on development and testing instances of the CMS and it exposed several operations which the test runner could use to orchestrate the CMS into different states.

For example, the JSON-RPC API exposed a create.event method for creating new events.

{ "jsonrpc": "2.0", "method": "create.event", "params": { "type": "meeting", "title": "Aragorn meets Gandalf", "start_date": "2021-06-01 8:00:00", "end_date": "2021-06-01 17:00:00" }, "id": 1 }

And a clean.content method for resetting the CMS to its initial state, with no content.

{ "jsonrpc": "2.0", "method": "clean.content", "params": {}, "id": 1 }

We added similar methods for creating other entities on the site and for admin tasks such as rebuilding the search index. ILO Live used AWS AppSync to receive real-time updates about the current state of events. To simulate this in the tests, we set up a testing instance of AppSync and implemented methods like the update.livestream method shown below to change the state of this instance.

{ "jsonrpc": "2.0", "method": "update.livestream", "params": { "event_id": "1", "status": "live", }, "id": 1 }

Speeding up the Drupal CMS's startup time using Docker

Now we had to get the Drupal CMS running alongside the test runner. We were already familiar with Cypress for frontend tests, so we decided to use it for the E2E tests as well. Since the E2E tests run assertions on elements of the site's frontend, we stored them in the frontend repo and used GitHub Actions to run them, as the frontend repo was hosted on GitHub.

During development, we used GNU Make and Docker Compose to run the CMS and its services (i.e. MariaDB and Redis), so initially we tried to clone the CMS and run the same commands to start it. Docker is preinstalled on the ubuntu-latest runner on GitHub Actions, so we were able to use it there without any extra setup.

```yaml
# Simplified GitHub Actions workflow for running the E2E tests
name: Tests
on: push
jobs:
  cypress:
    name: Run Cypress tests
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      # Clone the CMS into a subfolder
      - name: Clone CMS
        uses: actions/checkout@v4
        with:
          repository: github-org/ilo-live-cms
          path: ./cms
          # actions/checkout only has access to the current repo by default
          # A custom token needs to be provided for it to access a different repo
          # https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens
          token: ${{ secrets.CMS_ACCESS_TOKEN }}
          ref: develop
      - name: Start CMS
        run: |
          cd cms
          make
      # More steps to actually run the tests
```

```makefile
# Simplified Makefile from the CMS
default: build install

up:
	@echo "Starting up containers for ilo_live..."
	docker compose up -d --remove-orphans

build: up
	@echo "Building ilo_live project development environment."
	docker compose exec -T php bash -c "composer install"

install:
	docker compose exec -T php bash -c "vendor/bin/drush si -y --existing-config --account-pass=\"admin\""
	docker compose exec -T php bash -c "vendor/bin/drush deploy"
```

```yaml
# Simplified docker-compose.yml for the CMS and its services
services:
  mariadb:
    image: mariadb-image
  php:
    image: php-image
    volumes:
      - ./:/var/www/html
  redis:
    image: redis-image
```

This setup worked, but it took around 2 to 4 minutes to run. Most of this time was spent on Composer and the Drupal site install.

This delayed each test run so we started looking into ways to improve it. The solution that we came up with was to create a self-contained Docker image for the Drupal CMS. This way, the test runner only needed to pull and run a single image to start the CMS.
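One detail worth noting: after starting the container, the test runner still has to wait until Drupal inside it answers HTTP before the tests can begin. A minimal sketch of such a wait loop (the helper name, URL, and retry counts are our own assumptions, not from the project):

```shell
# Hypothetical helper: retry a command until it succeeds or we give up.
# $1 = command to retry, $2 = max attempts, $3 = seconds between attempts
wait_for() {
  attempt=1
  until eval "$1"; do
    if [ "$attempt" -ge "$2" ]; then
      echo "timed out waiting for: $1" >&2
      return 1
    fi
    attempt=$((attempt + 1))
    sleep "$3"
  done
  echo "ready"
}

# Typical usage after starting the CMS container:
#   docker run --detach -p 80:80 ghcr.io/github-org/ilo-live-cms:develop
#   wait_for "curl --silent --fail http://localhost/ > /dev/null" 30 2
```

The same loop works for any service the tests depend on; only the probe command changes.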

To do this, we set up a GitHub Actions workflow on the CMS repo to build and push a development Docker image. This image had the JSON-RPC API enabled and used a SQLite database instead of MariaDB.

```dockerfile
# Simplified Dockerfile for building the development docker image
FROM drupal-image

# Replace the site files in the image with our own
RUN rm -rf /opt/drupal/*
COPY ./ /opt/drupal/

# Install SQLite and dependencies required by Composer
RUN apt update && apt install -y sqlite3 git zip

WORKDIR /opt/drupal
RUN composer install

# Use custom settings for the development image
RUN cp docker/settings.bundle.php web/sites/default/settings.local.php

RUN vendor/bin/drush site:install -y --existing-config --account-pass=admin
RUN vendor/bin/drush deploy
```

```php
<?php
// Simplified settings.bundle.php for the development image

// Enable the config split for the JSON-RPC API for the development image
$config['config_split.config_split.jsonrpc']['status'] = TRUE;

// Use SQLite instead of MariaDB
$databases['default']['default'] = array(
  'database' => 'sites/default/files/.ht.sqlite',
  'prefix' => '',
  'driver' => 'sqlite',
  'namespace' => 'Drupal\\sqlite\\Driver\\Database\\sqlite',
  'autoload' => 'core/modules/sqlite/src/Driver/Database/sqlite/',
);
```

We used the Configuration Split module by our own Fabian Bircher to ensure that the JSON-RPC module is only enabled during development.

Here is the GitHub Actions workflow we used to build and publish the image to the GitHub Container Registry.

```yaml
# Simplified GitHub Actions workflow for building and publishing the development docker image
name: Build and push development docker image

on:
  push:
    # Usually the E2E tests can use the image built from the develop branch of the repo
    # But to test upcoming changes, we can prefix a branch with docker- to have this workflow build an image for it
    branches: ['develop', 'docker-*']

jobs:
  build-and-push-image:
    runs-on: ubuntu-latest
    # Grant the default `GITHUB_TOKEN` permission to read the current repo and push images to the GitHub Container Registry
    permissions:
      contents: read
      packages: write
    steps:
      - name: Log in to the Container registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      # This action extracts the tags and labels that should be set on the published image
      # It exposes them as an output which is consumed by the next step through steps.meta
      - name: Extract metadata (tags, labels) for Docker
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ghcr.io/github-org/ilo-live-cms
      # This action uses the git repo as the build context by default
      - name: Build and push Docker image
        uses: docker/build-push-action@v6
        with:
          push: true
          file: docker/Dockerfile
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
```

This is the updated workflow on the frontend repo for using the development image.

```yaml
# Simplified GitHub Actions workflow for running the E2E tests using the development docker image
name: Tests
on: push
jobs:
  cypress:
    name: Run Cypress tests
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Log in to the Container registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.ACCESS_TOKEN }}
      - name: Run CMS container
        run: |
          docker run \
            --detach \
            -p 80:80 \
            ghcr.io/github-org/ilo-live-cms:develop
```

With this setup, the development image is built within 5 minutes each time a pull request is merged into the develop branch of the CMS, and the E2E tests workflow can start the CMS in less than a minute.

An added benefit of this approach is that running the CMS became much simpler. Previously, the only way to run the CMS was to clone its repo and set up the project locally alongside the database server. Now a single command is enough.

```shell
docker run --detach -p 80:80 ghcr.io/github-org/ilo-live-cms:develop
```

Putting everything together

To use the JSON-RPC API more conveniently within the Cypress tests, we defined several Custom Commands.

```javascript
const getRPCPayload = (method, params = {}) => {
  // Get basic authorization header.
  const authorization = Buffer.from(
    [Cypress.env("CMS_USER"), Cypress.env("CMS_PASSWORD")].join(":")
  ).toString("base64");

  return {
    method: "POST",
    url: `${Cypress.env("CMS_URL")}/path/to/jsonrpc`,
    headers: {
      Accept: "application/json",
      Authorization: `Basic ${authorization}`,
    },
    body: {
      jsonrpc: "2.0",
      method,
      params,
      id: Math.floor(Math.random() * 1000),
    },
  };
};

Cypress.Commands.add("rpc", (method, params = {}) => {
  cy.request(getRPCPayload(method, params)).then((response) => {
    // Run assertions on the response from the CMS to ensure that the call ran successfully
    expect(response.status).to.eq(200);
    expect(response.body).to.have.property("result");
    return response.body.result;
  });
});

// Cypress commands can call other commands and build on top of each other
// So we created several utility functions to reduce repeated logic
Cypress.Commands.add(
  "createPastEvent",
  ({ id = 1, type = "meeting", startDate, startHour = 8 } = {}) => {
    const event = {
      type,
      title: `Past ${type} ${id}`,
      // Not the most robust logic but sufficient for testing
      start_date: `2023-01-${startDate || id} ${startHour}:00:00`,
      end_date: `2023-01-${startDate || id} ${startHour + 1}:00:00`,
    };
    cy.rpc("create.event", event);
  }
);

// Command that should be run before each test
Cypress.Commands.add("setup", () => {
  // Cypress runs each test in isolation so that they can't interfere with each other
  // To enforce that isolation across the entire system, we use this command to reset the CMS to its initial state
  cy.rpc("clean.content");
  // Enable preview mode on Next.js to make it regenerate pages during each request
  // https://nextjs.org/docs/pages/guides/preview-mode
  cy.request("/path/to/enable-preview");
});
```

These custom commands made it much easier to orchestrate the CMS as required.

describe("Event page", () => { beforeEach(() => { cy.setup(); }); it("should display past meetings correctly", () => { // Call the JSONRPC-API to setup the content cy.createPastEvent().then((res) => { // Visit the page corresponding to the newly created content cy.visit(res.path); // Check if the elements on the page are rendered correctly cy.get("h1").should("contain", "Past meeting 1"); cy.contains("1 January 2023"); cy.contains("08:00 - 09:00 GMT+1"); }); }); });

Final results

Ultimately, we implemented over 50 end-to-end tests for this project. The final test suite takes 7 to 8 minutes to run on a standard GitHub-hosted runner (i.e. not a large runner). This runtime is acceptable for us for now, but it could be reduced further through parallelization.
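As a sketch of what that parallelization could look like, GitHub Actions' built-in matrix strategy can split the spec files across several jobs. The spec paths and grouping below are assumptions for illustration; a real split would use folder structure or timing data.

```yaml
# Hypothetical sketch: run the Cypress suite across parallel jobs,
# each job handling a subset of the spec files
jobs:
  cypress:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        specs:
          - 'cypress/e2e/events/**'
          - 'cypress/e2e/livestreams/**'
          - 'cypress/e2e/search/**'
    steps:
      # ...checkout and start the CMS container as before...
      - name: Run Cypress specs
        run: npx cypress run --spec "${{ matrix.specs }}"
```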

While these tests took some effort to implement, they definitely paid off in the long term. ILO Live's live event functionality was mission-critical but very hard to orchestrate manually, and with the E2E tests we were able to ensure that it always worked. Above all, these E2E tests gave us the confidence to make big improvements to the codebase, since we knew that bugs in key functionality would be revealed instantly.

Written by Prabashwara Seneviratne, frontend developer at Nuvole and author of Frontend undefined.

Tags: Drupal Planet, Next.js, Frontend, Test Driven Development