Mario Hernandez: Building an automated DDEV-based Drupal environment

A successful training experience begins before we set foot in the training room. Or, in these days of distance learning, it begins before students log in to your training platform. The challenge is having an environment that is easy for students to set up and provides all the tooling required for the training. Configuring a native development environment is no easy task. Web development tools have gotten more complex over the years, and they are a huge barrier even for experienced developers. My goal in setting up this new environment was to have everything completely configured and automated so students only needed to run one command. Very ambitious.

WARNING: The codebase shared in this post is only intended for local development. DO NOT use this project in a production website.

About DDEV: The official name is DDEV-Local. For simplicity I use DDEV in this tutorial.

Here are the tools and configuration requirements for this environment:

  • Docker
  • DDEV
  • Composer
  • Drush
  • Drupal 8 and contrib modules
  • Pre-built Drupal entities (content types, paragraph types, views, view modes, taxonomy, image styles, and more)
  • Custom Drupal 8 theme
  • Twig debugging enabled by default and Drupal cache disabled by default
  • NodeJS, NPM, and NVM
  • Pattern Lab
  • Gulp, ESLint, Sass Lint, BrowserSync, Autoprefixer, and many many more node dependencies

Desired behavior

My main objective was to simplify building and interacting with the environment by:

  • Requiring only Docker and DDEV to be installed on the host computer
  • Reducing the number of steps for building the environment to a single command, ddev start
  • Executing most if not all commands in the containers, not the host machine (Drush, Composer, Pattern Lab, etc.)
  • Accessing Pattern Lab, which runs in the web container, from the host machine

Yes, it is crazy, but let's see how this turned out.

First things first. As I said before, Docker and DDEV are the only two things students will need to install. There is no way around this. Luckily there are plenty of resources to help you with this, and the DDEV docs are a great place to start.

Setting up a Drupal site

Before we start, I'd like to clarify that this post focuses on outlining the process and steps for automating a local environment. However, before we can automate, we need to build all the pieces. If you just want to grab the final product, here's the repo; otherwise, read on.

As of Drupal 8.8.0, Composer project templates are now available as part of Drupal core. These project templates are recommended for building new Drupal sites as they serve as a starting point for creating a Composer-managed Drupal site.

Two things we will be doing with our Drupal project:

  1. Drupal dependencies and modules will not be committed to the git repo. They will be downloaded with Composer when the project is built.
  2. Composer will not be installed on the host system; instead, we will use the Composer version that comes with DDEV's web container.

Let's start

  1. Create a new directory for your project. The directory name should contain only lowercase alphanumeric characters. For this example I will name it drupaltraining and will create it in my /Sites directory. Using your command line tool, create the new directory:

    mkdir drupaltraining
  2. Now navigate into the newly created directory

    cd drupaltraining
  3. Run the DDEV command below to set up a new Drupal project:

    ddev config --project-type=drupal8 --docroot=web --create-docroot
    • This will create a .ddev directory with basic Drupal configuration.
    • It will also create settings.php and settings.ddev.php files inside web/sites/default.
    • Finally, it creates a Drush directory.
    • Keep in mind Drupal is not in place yet; this simply sets up the environment for it.
  4. Now start DDEV to create the containers.

    ddev start
    • After the project has been built, you will see a link to open Drupal in the browser. For this project the link should be https://drupaltraining.ddev.site (ignore for now). If you have not installed mkcert (mkcert -install), the link you see may use http instead of https. That's fine, but I'd recommend always using https by installing mkcert.
  5. Now we are going to create the codebase for Drupal by using the official Drupal community Composer template.

    ddev composer create "drupal/recommended-project:^8"

    Respond Yes when asked if it's okay to override everything in the existing directory.

    • This will setup the codebase for Drupal. This could take a while depending on your connection.
    • composer.json will be updated so Drupal is added as a project dependency.
    • New settings.php and settings.ddev.php files are created.
    • Running ddev composer create uses the version of Composer that comes as part of DDEV's web container. This means Composer does not need to be installed on the host computer.
    • One thing I love about using the new composer template is the nice list of next steps you get after downloading Drupal's code base. So useful!
  6. Now let's grab some modules

    ddev composer require drupal/devel drupal/admin_toolbar drupal/paragraphs drupal/components drupal/viewsreference drupal/entity_reference_revisions drupal/twig_field_value
    • This will download the modules and update composer.json to set them as dependencies along with Drupal. The modules and Drupal's codebase will not be committed to our repo.
  7. Let's add Drush to the project

    ddev composer require drush/drush
    • Installing Drush as part of the project, so it runs in the web container, makes it possible to run Drush commands (ddev drush <command>) even if the host does not have Drush installed.
  8. Now launch Drupal in the browser to complete the installation

    ddev launch
    • This will launch Drupal's install page. Drupal's URL will be https://drupaltraining.ddev.site. Complete the installation using the Standard profile. Since the settings.ddev.php file already exists, the database configuration screen will be skipped during installation.
  9. Copy example.settings.local.php into web/sites/default as settings.local.php

    cp web/sites/example.settings.local.php web/sites/default/settings.local.php
    • This is recommended to override or add new configuration to your Drupal site.
  10. Update settings.php to include settings.local.php.

    if (file_exists($app_root . '/' . $site_path . '/settings.local.php')) {
      include $app_root . '/' . $site_path . '/settings.local.php';
    }
    • I'd suggest adding the above code right after the settings.ddev.php include already in settings.php. This will allow us to override configuration found in settings.ddev.php.
  11. Let's change Drupal's default config directory. Doing this will store exported configuration outside Drupal's web directory, which is a good security measure. Let's also ignore a couple of directories to avoid Drupal errors.

    if (empty($settings['config_sync_directory'])) {
      $settings['config_sync_directory'] = '../config/sync';
    }

    $settings['file_scan_ignore_directories'] = [
      'node_modules',
      'bower_components',
    ];
    • First block changes the default sync directory to drupaltraining/config/sync. If you look inside settings.ddev.php you will see that this directory is inside web. We want to store any configuration changes outside the web directory.
    • Second block sets up Drupal to ignore node_modules and bower_components. Drupal may look for Twig templates inside these directories, which could cause it to crash. Ignoring them avoids those errors.
  12. Since we've made configuration changes, restart DDEV so they take effect

    ddev restart

That's quite the process, isn't it? The good news is this entire process will be eliminated when we finish automating the environment.
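For reference, here is the full sequence of commands from the manual steps above in one place (excluding the settings.php edits described in steps 10 and 11), using the drupaltraining project name from this walkthrough:

mkdir drupaltraining && cd drupaltraining
ddev config --project-type=drupal8 --docroot=web --create-docroot
ddev start
ddev composer create "drupal/recommended-project:^8"
ddev composer require drupal/devel drupal/admin_toolbar drupal/paragraphs drupal/components drupal/viewsreference drupal/entity_reference_revisions drupal/twig_field_value
ddev composer require drush/drush
ddev launch
cp web/sites/example.settings.local.php web/sites/default/settings.local.php
ddev restart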

Drupal 8 custom theme

The project's theme is called training_theme. This is a Node-based theme and will be built with Mediacurrent's theme generator, which provides:

  • A best-practices Drupal 8 theme
  • Pattern Lab integration
  • Automated Front-End workflow
  • Component-based-ready environment
  • Production-ready theme

The final DDEV project will include the new Drupal 8 theme so there is no need to create it now, but if you want to see how the Theme Generator works, watch the video tutorial I recorded.

Automating our environment

Now that Drupal has been set up, let's begin the automation process.

Dockerfile

The web Docker container comes with Node and NPM installed. This will work in most cases, but the Drupal theme may use a version of Node not currently available in the container. In addition, the web container does not include Node Version Manager (NVM) to manage multiple Node versions. If the tools we need are not available in the web or db images/containers, there are ways to modify them to include the required tools. One of those ways is an add-on Dockerfile in your project's .ddev/web-build or .ddev/db-build, depending on which container you are trying to modify.

Inside .ddev/web-build create a file called Dockerfile (case sensitive), and in it, add the following code:

ARG BASE_IMAGE
FROM $BASE_IMAGE
ENV NVM_DIR=/usr/local/nvm
ENV NODE_DEFAULT_VERSION=v14.2.0
RUN curl -sL https://raw.githubusercontent.com/nvm-sh/nvm/v0.34.0/install.sh -o install_nvm.sh
RUN mkdir -p $NVM_DIR && bash install_nvm.sh
RUN echo "source $NVM_DIR/nvm.sh" >>/etc/profile
RUN bash -ic "nvm install $NODE_DEFAULT_VERSION && nvm use $NODE_DEFAULT_VERSION"
RUN chmod -R ugo+w $NVM_DIR

The code above sets an environment variable for the default Node version (v14.2.0), which is the Node version the theme uses at the time of this setup. It installs and configures NVM, and it makes NVM usable by updating the container's bash profile. A Dockerfile like this runs while the image is being built, altering the image's default configuration with the instructions it contains.
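If you want to confirm the change took effect after the web image is rebuilt (it is rebuilt on the next ddev restart), a quick sanity check is to open a shell in the web container and ask NVM which Node versions it knows about. This is only a sketch, assuming the Dockerfile above built successfully:

ddev ssh
# Inside the web container: load the profile the Dockerfile appended to, then list installed Node versions.
source /etc/profile && nvm ls
exit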

Custom DDEV commands

The Drupal 8 theme for this project uses Node for most of its tasks. To compile CSS, JavaScript, and Twig, we need to run commands such as npm install, npm run build, npm run watch, and others. Our goal is to be able to run these commands in the web container, not the host computer. Doing this eliminates the need for students to install any Node-related tools, which can get really complicated. While we could achieve this by asking students to first SSH into the web container (ddev ssh), navigate into /web/themes/custom/training_theme/, and then run the commands, I want to make it even easier for them. I want them to be able to run the commands from any directory within the project without having to SSH into the web container. To achieve this we need to create a couple of custom commands.

Custom commands can be created to run in containers as well as the host machine. Since we want to run these commands in the web container, we are going to create the commands inside .ddev/commands/web/. Custom commands are bash script files.

NVM custom command

  • Inside .ddev/commands/web/ create a new file called nvm
  • Add the following code in the file:
#!/bin/bash

## Description: Run any nvm command.
## Usage: nvm [flags] [args]
## Example: "nvm use or nvm install"

source /etc/profile && cd /var/www/html/web/themes/custom/training_theme && nvm $@
  • Since this is a bash script, #!/bin/bash is required as the first line in the file.
  • A description of the script is a good practice to explain what the script does.
  • Pay close attention to ## Usage: nvm [flags] [args]. This is what makes the commands work. The [flags] and [args] are ways to pass arguments to the command. For example, nvm on its own won't do much, but use or install can be passed as parameters to complete the commands nvm use or nvm install. Being able to run these commands will allow us to install new versions of Node later on if needed.
  • Next we are adding examples of potential commands that can be run.
  • Finally, you see the actual commands. source /etc/profile reloads the container's bash profile so NVM can run. Then we navigate into the training_theme directory within the container, where the nvm commands will be executed. So technically in the script above we are running three commands in one; using && in between the commands lets us chain them. The $@ after nvm represents the flags or arguments we can pass (i.e. use or install).

Running the new commands: Every custom command needs to be executed by adding ddev before the command. For example: ddev nvm use. Using ddev in front of the command instructs the system to run the command in the containers rather than on the host computer.
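For instance, installing and switching to a specific Node version from the host might look like this (v14.2.0 is just an example; it happens to be the version the Dockerfile already installs):

ddev nvm install v14.2.0
ddev nvm use v14.2.0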

NPM custom commands

We will also create a custom script to run NPM commands. This will be similar to the NVM script, and it will be executed as ddev npm install, ddev npm run build, etc. The npm commands will allow us to install the node dependencies required by the theme, as well as execute tasks like compiling code, linting code, compressing assets, and more.

  • Create a new file inside .ddev/commands/web and call it npm
  • Add the following code in the file:
#!/bin/bash

## Description: Run npm commands inside theme.
## Usage: npm [flags] [args]
## Example: "npm install or npm rebuild node-sass or npm run build or npm run watch"

cd /var/www/html/web/themes/custom/training_theme && npm $@
  • Most of the code here is similar to the previous script, except instead of nvm we will run npm.
  • Notice we are again navigating into the theme directory before running the command. This makes it possible for the custom commands to be run from any directory within our project while still being executed inside the training_theme directory in the container.

Drush custom command

Let's create one last custom command to run Drush commands within the DDEV containers:

  • Create a new file inside .ddev/commands/web and call it drush
  • Add the following code in the file:
#!/bin/bash

## Description: Run drush inside the web container
## Usage: drush [flags] [args]
## Example: "ddev drush uli" or "ddev drush sql-cli" or "ddev drush --version"

drush $@
  • This will allow us to run Drush commands in the container using ddev drush <command> (i.e. ddev drush cr, ddev drush updb -y, etc.).

So that's it for custom commands. By having custom commands for nvm and npm, we can now successfully run any theming-related tasks.
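As an example, a typical first run of the theme's tooling, using only the custom commands above, might look like the following. The exact npm scripts depend on the theme's package.json, so treat this as a sketch:

ddev npm install     # install the theme's node dependencies inside the web container
ddev npm run build   # compile the theme's CSS/JS and build Pattern Lab
ddev npm run watch   # recompile on changes and serve Pattern Lab on port 3000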

Automating Drupal's setup

We want to streamline the Drupal installation process. In addition, we want to be able to import a custom database file so we have access to all the infrastructure needed during training. This includes content types, paragraph types, views, view modes, image styles, and more. Enter DDEV hooks.

DDEV Hooks

Hooks are a great way to perform tasks before or after DDEV starts. There are tasks that need to happen in a specific sequence, and hooks allow us to do just that. So what's the difference between custom commands and hooks? Technically hooks can be considered custom commands, but the difference is that they are executed automatically before or after DDEV starts, whereas custom commands are run on demand at any time. DDEV needs to be running if custom commands are intended to run in containers. Back to hooks: let's build a post-start hook.

Hooks can be run as pre-start, post-start, and after-db-import. Also, hooks can be executed inside containers and/or on the host machine. In our case all the tasks we outlined above will run after DDEV starts, and most of them will run in Docker containers.

  • Open .ddev/config.yaml and add the following code at the bottom of the file. There may already be a hooks section in your file. Be sure indentation in the file is correct.
hooks:
  post-start:
    - composer: install
    - exec: /var/www/html/db/import-db.sh
    - exec: drush updb -y
    - exec: drush cim -y
    - exec-host: cp -rf web/sites/example.development.services.yml web/sites/development.services.yml
    - exec-host: cp -rf web/assets/images/* web/sites/default/files/images/
    - exec: drush cr
    - exec-host: ddev launch /user
  • post-start: indicates the tasks declared will run after DDEV's containers have started. We want the web and db containers available before running any of the tasks.
  • composer: install will download all dependencies found in composer.json (Drupal core, modules, drush and others).
  • exec: /var/www/html/db/import-db.sh is a custom script which imports a custom database file (provided in this project). We will go over this script shortly. Importing a custom database builds the Drupal site without having to install Drupal as long as Drupal's codebase exists.
  • exec: drush cim -y, exec: drush updb -y, and exec: drush cr, are basic drush commands.
  • exec-host: cp -rf web/sites/example.development.services.yml web/sites/development.services.yml creates a new development.services.yml. This is needed because when Drupal is installed, development.services.yml is overridden, and since we use custom configuration in that file to enable Twig debugging, we need to restore it by replacing the file with a copy of our own.
  • exec-host: cp -rf web/assets/images/* web/sites/default/files/images/ copies a collection of images used in the demo content added to the site.
  • Finally, exec-host: ddev launch /user will open a fully configured Drupal website in the browser.

Login to Drupal: Username: admin, password: admin
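Alternatively, once the site is up you can generate a one-time login link with the Drush custom command created earlier:

ddev drush uli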

Import database script

Now let's write the script to import the database we are using in the hook above.

  • In your project's root, create a new directory called db
  • In the new directory create a new file called import-db.sh and add the following code:
#!/bin/bash

# Use a table that should exist in your database.
if ! mysql -e 'SELECT * FROM node__field_hero;' db 2>/dev/null; then
  echo 'Importing the database'
  # Provide path to custom database.
  gzip -dc /var/www/html/db/drupaltraining.sql.gz | mysql db
fi
  • This is another bash script which performs a database import, but only if the database is empty or clean. We do this by checking whether one of the tables we expect in the database exists (node__field_hero). If it does, the database is not imported; if it doesn't, the database is imported. This can be any table you know should exist.

  • Notice the script calls for a database file named drupaltraining.sql.gz. This means the database file should exist inside the db directory alongside the import-db.sh script. This database file was created/exported after Drupal was configured with all the infrastructure and settings required for the training (a sketch of the export command follows this list).

  • Make the script executable

chmod +x db/import-db.sh
  • This will ensure the script can be executed by DDEV; otherwise we would get a permission-denied error.
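For reference, the drupaltraining.sql.gz file mentioned above was exported from the fully configured site. A minimal sketch of how such a file could be created with DDEV's built-in export command (the exact flags may vary between DDEV versions):

ddev export-db --file=db/drupaltraining.sql.gz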

That does it for automation. The next few tasks improve the development environment; some of them are optional.

Exposing Pattern Lab's port to the host computer

Since our goal is to avoid installing any front-end tools on the host computer, Pattern Lab has to run in the web container. The problem is that we can't open Pattern Lab in the host's browser while it is running in the container. For this to work we need to expose the port on which Pattern Lab runs to the host machine. In this environment, that port is 3000 (the port number may vary). We identify the port by running npm run watch inside the training_theme directory, which prints a series of links for accessing Pattern Lab in the browser.

Under the hood, DDEV uses docker-compose to define and run the multiple containers that make up the local environment for a project. docker-compose supports defining multiple compose files to facilitate sharing Compose configurations between files and projects, and DDEV is designed to leverage this ability.

Creating a new docker-compose.*.yaml

A docker-compose file allows us to do many things, including exposing ports from the containers to the host computer.

On a typical Pattern Lab project, running npm start shows Pattern Lab on http://localhost:3000. In this environment the equivalent command is ddev npm run watch. Since this environment runs Pattern Lab in the web container, the only way to access Pattern Lab in the browser is by having access to the container's port 3000. Exposing the port via the http or https protocols makes it possible to access Pattern Lab's UI page from the host machine.

  1. Inside .ddev/, create a new file called docker-compose.patternlab.yaml

  2. In the new file add the following code:

    # Override the web container's standard HTTP_EXPOSE and HTTPS_EXPOSE
    # to access patternlab on https://drupaltraining.ddev.site:3000, or http://drupaltraining.ddev.site:3001.
    version: '3.6'
    services:
      web:
        # ports are a list of exposed *container* ports
        ports:
          - "3000"
        environment:
          - HTTP_EXPOSE=${DDEV_ROUTER_HTTP_PORT}:80,${DDEV_MAILHOG_PORT}:8025,3001:3000
          - HTTPS_EXPOSE=${DDEV_ROUTER_HTTPS_PORT}:80,${DDEV_MAILHOG_HTTPS_PORT}:8025,3000:3000
    • The patternlab part of the file name is arbitrary. It makes sense to use a name related to the action, app, or service you are trying to implement, so in this example docker-compose.patternlab.yaml made sense.
    • How did we arrive at the content above for this file? DDEV comes with two files that can be used as templates for new configuration; one of them is called .ddev-docker-compose-base.yaml, and you can find all the code we added above in that file.
  3. Restart DDEV to allow for the new changes to take effect:

    ddev restart
    • In essence, the code above takes the web container's port 3000 and exposes it to the host machine through the http and https protocols.
  4. For the above to work Pattern Lab needs to be running in the container:

    ddev npm run watch

Viewing Pattern Lab in the host machine
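With the docker-compose override in place and ddev npm run watch running in the web container, Pattern Lab should be reachable from the host at https://drupaltraining.ddev.site:3000 (or http://drupaltraining.ddev.site:3001 over plain HTTP), as noted in the comment at the top of docker-compose.patternlab.yaml.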

Using NFS (optional)

Last but not least, enabling NFS in DDEV can help with the performance of your application. NFS (Network File System) is a classic, mature Unix technique to mount a filesystem from one device to another. It provides significantly improved web server performance on macOS and Windows. This is completely optional and in most cases you may not even need it. The steps below are for macOS only; learn more about NFS and how to enable it on other operating systems such as Windows.

  • In your command line run
id
  • Make note of uid (user id) and gid (group id).
  • Open /etc/exports in your code editor, and add the following code, preferably at the top of the file:
/System/Volumes/Data/Users/xxxx/Sites/Docker -alldirs -mapall=502:20 localhost
  • Replace xxxx with your username. The full path shown above is required if you are using macOS Catalina.

  • Replace Sites/Docker (this is my personal projects directory) with the directory where your DDEV projects are created. Most people would mount the entire user home directory, but I think mounting only the directory where your projects live is good enough (/Sites/Docker/).

  • Replace 502 and 20 with the values you got when you ran the id command above.

  • Update DDEV's config.yaml to enable NFS

nfs_mount_enabled: true
  • Restart DDEV
ddev restart

You should notice an improvement in performance in your Drupal website.
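If you want to verify that the NFS mount is actually being used, DDEV also ships a debug helper for this; this assumes a DDEV version that includes the command:

ddev debug nfsmount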

In closing

I realize there is a lot here, but I am pretty happy with how this turned out. Thanks to all the work in this article, when one of the students wants to set up their training environment, all they have to do is run ddev start. I think that's pretty sweet! 🙌 Happy DDEVing!

Giving credit

Before I started on this journey, I knew very little about DDEV. Thanks to the amazing help from Randy Fay from Drud.com and Michael Anello from DrupalEasy.com, I have learned a lot in the past weeks and wanted to share it with other community members. You can find both Randy and Michael on Drupal's DDEV Slack channel. They are extremely helpful and responsive.

Resources

Mario Hernandez: Running a training workshop

Update 1-10-19
I wrote an extended version of this post at Mediacurrent's blog, check it out.

As long as I can remember I've enjoyed public speaking. This doesn't mean I am good at it; it simply means I enjoy it. School events, class president, my jobs, etc., all taught me great lessons about public speaking. So when I started as a developer, sharing my knowledge with others at conferences or meetups came pretty naturally.

I'd like to clarify that after years of doing talks and other kinds of public speaking, I am still terrified. I get nervous, my hands sweat, my legs shake, and my voice gets weird. Basically, what I am trying to say is that I'm not an expert by any means, but I overcome the phobia of public speaking by doing it frequently.

For many years I have been speaking at conferences, but in the past few years I started conducting longer workshops. I first started doing online workshops, which have their pros and cons. They don't put you face to face with your audience, and they don't give you a good sense of how effective your training is because you can't see people's reactions. For this reason I prefer to do face-to-face training.
As part of my job I conduct periodic Front-End training workshops for clients, and recently I started conducting all-day training workshops at conferences. I really enjoy it and I'd like to share some of the lessons learned.

Picking a topic

Ideally you want to pick a topic you feel 100% confident about. I have learned that people attending your training or talks welcome any information you can share no matter how simple or elementary it may feel to you. Don't ever think what you know may not be of interest to others because you would be wrong.

Lately I have been challenging myself a little more when picking a topic to train on. While it is good to know the topic well, it is also extremely rewarding to pick a topic you'd like to learn more about. This may feel contrary to what I said earlier, but hear me out. When you decide to train on a topic, you will spend a lot of time preparing, training, testing and rehearsing. This is exactly how you learn a new skill. I can't tell you how many times I have come out of a training knowing more about the topic than before, and also learning from the people who attended. If you want to learn a new skill, teaching others about it could be the best way to learn it.

Preparing for the training

Everyone has their own style for teaching or presenting. Some people like to use slides and screenshots; others show recordings of their project or code. My personal preference is to build a working prototype. This approach has many advantages, but it also means you will spend more time getting ready.
My training workshops usually include very few slides, because the majority of the training is spent writing actual code and building the prototype during the training.

Here's my typical process for preparing for a training workshop:

  • Identify a prototype that serves the purpose of the training. If I am teaching a workshop about component-based development, I would normally pick something that involves the different aspects of component-based development (atoms, nested components, reusable components, etc.)

  • Build the prototype upfront to ensure you have a working model to demo and go by.

  • Once the prototype is built, create a public repo so you can share the working prototype

  • Write step-by-step instructions for building the prototype. Normally I break the prototype down into small components, or atoms.

  • Test, test and test. You want to make sure your audience will not run into unexpected issues while following your instructions. For this reason you need to make sure you test your instructions. Ask a friend or colleague to go through each of your exercises to ensure things work as expected.

  • Provide a pre-training evaluation: a quick set of questions that will give you an idea of people's skill levels as well as their environments (Linux, Windows, OSX). This will help you plan ahead of time.

  • Build a simple slide deck for introductions and agenda purposes. I move away from slides as soon as the introduction and agenda are done; the rest of the training is all hands-on.

Communicate with your audience ahead of time

As you will learn, one thing that can really eat up training time is assisting people with their local environment setup. I have conducted training workshops where I've spent half the time helping people with their environments. For this reason, nowadays I communicate with people ahead of time to ensure everyone's local environment is ready to go.

I normally make myself available once or twice in an evening through a Google Hangout to assist anyone who may need help. I also provide detailed instructions on how to get their local environment ready. This can save you a lot of time during training. In addition, for those who did spend the time getting their environment ready, it's not fair that they be held back because someone else did not make the effort to set up their environment.
I make myself available ahead of the training, but if someone is still having issues because of neglect, I don't hold the rest of the class back. I try to help them, but at some point I move on.

During training

If possible, get help from someone who is also well versed in the topic so they can help you assist people who get stuck. Nothing is more frustrating than having to break the flow of the training to help people who get stuck. Having someone else handle this allows you to continue with the training without everyone losing momentum.

Finally

Enjoy yourself. Make sure you and your audience have fun. If you show excitement in what you are doing people will get excited as well.

LakeDrops Drupal Consulting, Development and Hosting: ECA 2.0.0 has been released for Drupal 10.3 and 11

ECA 2.0.0 has been released for Drupal 10.3 and 11. Jürgen Haas, Wed, 19.06.2024 - 10:41

Almost 2 years ago, ECA 1.0.0 was published, and a lot happened in the 23 months in between. Today, ECA gets its first major update which comes not only with a ton of new features but also with code clean-up, performance improvements and support for the latest Drupal core releases 10.3 and soon 11.

ComputerMinds.co.uk: My text filter's placeholder content disappeared!


A story of contributing a fix to Drupal... and a pragmatic workaround

When I upgraded a site from Drupal 10.1 to 10.2, I discovered a particularly serious bug: the login form on our client's site vanished ... which was a big problem for a site that hid all content behind a login!

We had a custom text format filter plugin to render the login form in place of a custom token in text that editors set, on one of the few pages that anonymous users could access. Forms can have quite different cacheability to the rest of a page, and building them can be a relatively expensive operation anyway, so we used placeholders which Drupal can replace 'lazily' outside of regular caching:

class MymoduleLoginFormFilter extends FilterBase implements TrustedCallbackInterface {

  public function process($text, $langcode) {
    $result = new FilterProcessResult($text);
    $needle = '[login_form]';
    // No arguments needed as [login_form] is always to be replaced with the same form.
    $arguments = [];
    $replace = $result->createPlaceholder(self::class . '::renderLoginForm', $arguments);
    return $result->setProcessedText(str_replace($needle, $replace, $text));
  }

  public static function renderLoginForm() {
    // Could be any relatively expensive operation.
    return \Drupal::formBuilder()->getForm(UserLoginForm::class);
  }

  public static function trustedCallbacks() {
    return ['renderLoginForm'];
  }

}

But our text format also had core's "Correct faulty and chopped off HTML" filter enabled - which completely removed the placeholder, and therefore the form went missing from the final output!

Debugging this to investigate was interesting - it took me down the rabbit hole of learning more about PHP 8 Fibers, as Drupal 10.2 uses them to replace placeholders. Initially, I thought the problem could be there, but it turned out that the placeholder itself was the problem. Drupal happily generated the form to go in the right place, but couldn't find the placeholder. Here's what a placeholder created by FilterProcessResult::createPlaceholder() should look like:

<drupal-filter-placeholder callback="Drupal\mymodule\Plugin\Filter\MymoduleLoginFormFilter::renderLoginForm" arguments="" token="hqdY2kfgWm35IxkrraS4AZx6zYgR7YRVmOwvWli80V4"></drupal-filter-placeholder>

Looking very carefully, I spotted that the arguments="" attribute in the actual markup was just arguments - i.e. it had been turned into a 'boolean' HTML attribute:

<drupal-filter-placeholder callback="Drupal\mymodule\Plugin\Filter\MymoduleLoginFormFilter::renderLoginForm" arguments token="hqdY2kfgWm35IxkrraS4AZx6zYgR7YRVmOwvWli80V4"></drupal-filter-placeholder>

HTML 5 allows only a limited set of these boolean attributes, and yet the masterminds/html5 component that Drupal 10.2 now uses to process HTML 5 requires an explicit list of the attributes that should not get converted to boolean attributes when they are set to an empty string.

At this point, I should point out that this means a simple solution could be to just pass some arguments so that the attribute isn't empty! That is a nice immediate workaround that avoids the need for any patch, so is an obvious maintainable solution:

// Insert your favourite argument; any value will do.
$arguments = [42];

At least that ensures our login form shows again!


But I don't see any documentation saying there must be arguments, and it would be easy for someone to write this kind of code again elsewhere, especially if we're trying to do The Right Thing by using placeholders in filters.

So I decided to contribute a fix back to Drupal core. I've worked on core before. Sometimes it's a joy to find or fix something that affects thousands of people, other times the contribution process can be soul-destroying. At least in this case, I found an existing test in core that could be easily extended to demonstrate the bug. Then I wrote a surgical fix... but I can see that it tightly couples the filter system to Drupal's HtmlSerializerRules class. That class is within the \Drupal\Component namespace, which is described as:

Drupal Components are independent libraries that do not depend on the rest of Drupal in order to function.

Components MAY depend on other Drupal Components or external libraries/packages, but MUST NOT depend on any other Drupal code.

So perhaps it needs configuration in order to be decoupled; and/or a factory service; or maybe modules should subscribe to an event to be able to inject their own rules .... and very quickly perfection feels like the enemy of good, as I can imagine the scope of a solution ballooning in size and complexity. 

I'm all for high standards in core, but fulfilling them to produce solutions can still be a slow and frustrating experience. I'm already involved in enough long-running issues that just bounce around between reviewers, deprecations and changes in standards. I risk just ranting here rather than providing answers - and believe me, I'm incredibly grateful for the work that reviewers and committers have put into producing Drupal - but surely the current process must be putting so many potential contributors off. We worry about attracting talent to the Drupal ecosystem, and turning Takers into Makers, but what are they going to find when they arrive? Contributing improvements of decent size is hard and can require perseverance over years. Where can we adjust the balance to make contribution easier for anyone, even seasoned developers?

As I suggested, perhaps this particular bug needs any of a factory pattern, event subscriber, or injected configuration... but what would my next step be? I'm reluctant to put effort into writing a more complex solution when I know from experience that reviewers might just suggest doing something different anyway. At least I have that simple (if unsatisfying) workaround for the filter placeholder method: always send an argument, even if it might be ignored. I guess that reflects the contribution experience itself sometimes!

Wim Leers: XB week 5: chaos theory

We started the week off with a first MVP config entity: the component config entity. We have to start somewhere and then iterate — and iterate we can: thanks to Felix “f.mazeikis”, #3444417 landed! Most crucially this means we now have both a low-level unit test and a kernel test that ensures the validation for this config entity is thorough — of course I’m bringing config validation to Experience Builder (XB) from the very start!

It doesn’t do much: it allows “opting in” single directory components (SDCs) to XB, and now only those are available in the PoC admin UI. Next up is #3452397: expanding the admin UI and the config entity to allow defining default values for its SDC props. This will allow the UI to immediately generate a preview of the component, which otherwise may (depending on the component) not render anything visible (imagine: a <h2> without text, an <img> without an image, etc.).

But literally everything about this config entity is subject to change:

  • its name (issue: #3455036)
  • the whole “opting in” principle (issue comment: #3444424-7)
  • what it does exactly (issue: #3446083)
  • whether it should handle only exposing SDCs 1:1 in XB, or whether it should also handle other component types (issue: #3454519)

How did we get here?

Which brings me to the main subject for this week: I have a theory that explains the chaos that I think now many of you see.

We rushed to open the 0.x branch to invite contribution, because Dries announced XB at DrupalCon. This wrongly gave the impression that there was a very detailed plan, with a precise architecture in mind. That is not the case, and mea culpa for giving that impression. That branch was simply the combination of different PoCs merged together.

Some additional context: back in March, from one day to the next, I, along with Ben “bnjmnm”, Ted “tedbow”, Alex “effulgentsia”, Tim and Lauri, plus Felix and Jesse (both from the Site Studio team, both very experienced), was asked to stop everything we were doing (for me: dropping config validation) and start estimating the product requirements.

~4 weeks of non-stop grueling meetings 1 to assess and estimate the 64 requirements, with Lauri providing visual examples and additional context for the product requirements he crafted (after careful research: #3452440). We looked at multiple radical options and evaluated which of these was viable:

  1. building on top of one of the page building solutions that already exist in the Drupal ecosystem
  2. partnering with one of the existing page building systems, if they were willing to relicense under GPL-2+ and we’d be able to fit them onto Drupal’s data modeling strengths
  3. building something new (but reusing the parts in the Drupal ecosystem that we can build on top of, such as SDC, field types for data modeling, etc.)

The result: 60 pages’ worth of notes 2 with, for each of the 42 “Required for MVP” product requirements (out of a total of 64), an outline of how it could be implemented, a best/realistic/worst case estimate (week-level accuracy), which skillsets were needed and how many people. It’s during this process that it became clear that only one choice made sense: the last one. That was the conclusion we reached because the existing solutions in the ecosystem fall short (see #3452440), Site Studio does not meet all of XB’s requirements and some of its architectural choices conflict with XB’s requirements, and we did not end up finding a magical unicorn that meshed perfectly with Drupal.

It’s during this time that I made my biggest mistake yet: because the request for estimating this was coming from within Acquia, I assumed this was intended to be built by a team of Acquians. Because … why else create estimates?! If it’s built by a mix of paid full-time, paid part-time and volunteer community members, there’s little point in estimating, right?!
Well, I was wrong: turns out the intent was to get a sense of how feasible it was to achieve in roughly which timeline. But I and my fellow team members were so deep into this enormous estimation exercise based on very high-level requirements that thinking about capturing this information in a more shareable form was a thought that simply did not occur…

So, choice 3 it was, with people that have deep Layout Builder knowledge (Ted & Tim), while bringing in expertise from the Site Studio team (Jesse & Felix) and strong front-end expertise (Ben & Harumi “hooroomoo”) … because next we were asked to narrow the estimates for the highest-variance estimates, to bring it down to a higher degree of confidence. That’s where the “FTEs”, “Best Case Size (weeks)”, “Realistic Case Size (weeks)”, “Worst Case Size (weeks)”, “Most Probable Case Size (calculated, FTE weeks)”, “Variance” columns in the requirements spreadsheet come in.

For the next ~4 weeks, we built PoCs for the requirements where our estimates had either the highest variance or the highest number, to both narrow and reduce the estimates. That’s where all the branches on the experience_builder project come from! For example:

Week 1 = witch pot

After the ~8 chaotic weeks above, it was DrupalCon (Dries also wrote about this, just before DrupalCon, based on the above work!), and then … it was week 1.

In that first week, we took a handful of branches that seemed to contain the pieces most sensible to start iterating on an actual implementation, threw that into a witch pot, and called that 0.x!

Now that all of you know the full origin story, and many of you have experienced the subsequent ~4 equally chaotic weeks, you’re all caught up! :D

Missed a prior week? See all posts tagged Experience Builder.

Goal: make it possible to follow high-level progress by reading ~5 minutes/week. I hope this empowers more people to contribute when their unique skills can best be put to use!

For more detail, join the #experience-builder Slack channel. Check out the pinned items at the top!

Going forward: order from chaos

So, how are we going to improve things?

Actual progress?

So besides the backstory and attempts to bring more order & overview, what happened?

The UI gained panning support (#3452623), which really makes it feel like it’s coming alive:

Try it yourself locally if you like, but there’s not much you can do yet.
Install the 0.x branch — the “Experience Builder PoC” toolbar item takes you there!

… you can also try an imperfect/limited version of the current UI on the statically hosted demo UI thanks to #3450311 by Lee “larowlan”.

We also got two eslint CI jobs running now (#3450307): one for the typical Drupal stuff, one for the XB UI (which is written in TypeScript) and reduced the overhead of PHPStan Level 8 (again thanks to Lee, in #3454501) … and a bunch more low-level things that were improved.

Various things are in progress … expect an update for those next week :)

Thanks to Lauri for reviewing this!

  1. Jesse Baker aptly labeled this period as “bonding through trauma” — before the meetings started mid-March we didn’t know each other, and afterwards it felt like I’d worked with Felix & Jesse for years! ↩︎

  2. In a form that is not sufficiently digestible for public consumption. ↩︎

DrupalEasy: Two very different European Drupal events in one week

Image removed.

I was fortunate enough to attend two very different European Drupal events recently, and wanted to take a few minutes to share my experience as well as thank the organizers for their dedication to the Drupal community.

The events couldn't have been more different - one being a first-time event in a town I've never heard of and the other celebrating their 20th anniversary in a city that can be considered the crossroads of the Netherlands.

DrupalCamp Cemaes

First off - it is pronounced KEM-ice (I didn't learn this until I actually arrived). Cemaes is a small village at the very northern tip of Wales with a population of 1,357. Luckily, one of those 1,357 people is Jennifer Tehan, an independent Drupal developer who decided to organize a Drupal event in a town that (I can only assume) has more sheep than people. 


When this event was first announced a few months back, I immediately knew that this was something I wanted to attend - mostly because I've always wanted to visit Wales to do some hiking ("walking," if you're not from the United States). I used the camp as an excuse for my wife and me to take a vacation and explore northern Wales. I could write an entire other blog post on how amazeballs the walking was…


DrupalCamp Cemaes was never going to be a large event, and it turns out there wound up being only nine of us (ten if you count Frankie, pictured below). The advantage of such a small event was easy to see - we all got to know each other quite well. There's something really nice - less intimidating - about small Drupal events that I realized I've really missed since the decline of Drupal meetups in many localities.


The event itself was an unconference, with different people presenting on Drupal contributions (James Shields), GitHub Codespaces (Rachel Lawson), remote work (Jennifer Tehan), the LocalGov distribution (Andy Broomfield), and Starshot (myself.)

The setting of the event couldn't have been more lovely - an idyllic seaside community where everything was within walking distance, including the village hall where the event took place. If Jennifer decides to organize DrupalCamp Cemaes 2025, I'm pretty sure Gwendolyn and I will be there.

Read a brief wrap-up of DrupalCamp Cemaes from Jennifer Tehan. 


This camp was proof that anyone, anywhere can organize a successful Drupal event with a minimum of fuss. 

Drupaljam

Four days later, I was in Utrecht, Netherlands, at the 20th anniversary of Drupaljam, the annual main Dutch Drupal event. 


I had previously attended the 2011 Drupaljam and recall two things about that event: the building it took place in seemed like it was from the future, and many of the sessions were in Dutch.

This was a very different event from the one I had been at a few days earlier, in a city of almost 400,000 people, with over 300 attendees (I was told this number anecdotally) in one of the coolest event venues I've ever been to. Defabrique is a renovated linseed oil and compound feed factory, a bit over 100 years old, that is (IMHO) absolutely perfect for a Drupal event. Each and every public space has character, yet is also modern enough to support open-source enthusiasts. On top of all that, the food was amazing.

Attending Drupaljam was a bit of a last-minute decision for me, but I'm really glad that I made the time for it. I was able to reconnect with members of the European Drupal community that I don't often see, and make new connections with more folks than I can count.

I spent the day the way I spend most of my time at Drupal events - alternating between networking and attending sessions that attracted me. There were a number of really interesting AI-related sessions that I took in; it's pretty obvious to me that the Drupal community is approaching AI integrations in an unsurprisingly thoughtful manner.

The last week reinforced to me how fortunate I am to be able to attend so many in-person Drupal events. The two events I was able to participate in couldn't have been more different in scale and scope, but don't ask me to choose my favorite, because I'm honestly not sure I could!
 

Nonprofit Drupal posts: June Drupal for Nonprofits Chat

Join us THURSDAY, June 20 at 1pm ET / 10am PT, for our regularly scheduled call to chat about all things Drupal and nonprofits. (Convert to your local time zone.)

We don't have anything specific on the agenda this month, so we'll have plenty of time to discuss anything that's on our minds at the intersection of Drupal and nonprofits.  Got something specific you want to talk about? Feel free to share ahead of time in our collaborative Google doc: https://nten.org/drupal/notes!

All nonprofit Drupal devs and users, regardless of experience level, are always welcome on this call.

This free call is sponsored by NTEN.org and open to everyone. 

  • Join the call: https://us02web.zoom.us/j/81817469653

    • Meeting ID: 818 1746 9653
      Passcode: 551681

    • One tap mobile:
      +16699006833,,81817469653# US (San Jose)
      +13462487799,,81817469653# US (Houston)

    • Dial by your location:
      +1 669 900 6833 US (San Jose)
      +1 346 248 7799 US (Houston)
      +1 253 215 8782 US (Tacoma)
      +1 929 205 6099 US (New York)
      +1 301 715 8592 US (Washington DC)
      +1 312 626 6799 US (Chicago)

    • Find your local number: https://us02web.zoom.us/u/kpV1o65N

  • Follow along on Google Docs: https://nten.org/drupal/notes

View notes of previous months' calls.

The Drop Times: Christina Lockhart on Elevating Drupal: The Essential Role of Women in Tech

Discover how women are transforming the Drupal community and driving innovation in technology. In this interview, Christina Lockhart, Digital Marketing Manager at the Drupal Association, shares her journey, her key responsibilities, and the vital role women play in advancing Drupal. Learn about her strategies for promoting DrupalCon, her insights on overcoming barriers for women in tech, and the importance of empowering female leaders in the industry.

Specbee: Getting started with integrating Drupal and Tailwind CSS

If you’re tired of spending hours tweaking CSS stylesheets to get your website looking just right, then you should consider a new way of CSS’ing (yeah, we just made it up). We’re talking about Tailwind CSS - a different way of doing CSS. It uses a utility-first approach, which means instead of creating a class for a component, let’s say a button, you use pre-defined classes built to style it, like “font-bold” or “text-white”. All of this right from the comfort of your HTML! Couple Tailwind CSS, a sleek, utility-based open-source CSS framework, with a modular, highly customizable, and modern CMS like Drupal 10, and you have a powerhouse combination! In this article, we’ll be talking about Drupal and Tailwind CSS, and how you can integrate them seamlessly to build modern, responsive, and visually stunning websites.

Why Drupal is a good choice for web development

Drupal is an open-source content management system (CMS) that has been gaining popularity among web developers for its versatility and powerful features. The latest version, Drupal 10, brings a host of new features and improvements meant to simplify development and enhance user experience.

  1. Highly customizable design: One of the biggest advantages of using Drupal is its highly customizable design. With a wide range of themes and templates available, developers have the freedom to create unique and visually appealing websites that cater to their client's specific needs.
  2. Flexibility and scalability: Drupal's modular architecture makes it incredibly flexible and scalable, making it suitable for building websites of any size or complexity. It offers a vast library of modules that can be easily integrated into your website, providing additional functionality such as e-commerce capabilities, social media integration, multilingual support, and more.
  3. Robust security measures: Security is a major concern in today's times, hence it is crucial to choose a CMS that prioritizes it. Drupal has always been recognized as one of the most secure CMS platforms due to its stringent security standards and frequent updates that address any potential vulnerabilities promptly.
  4. SEO-friendly: Search Engine Optimization (SEO) is vital for every website's success as it determines its visibility on search engines like Google. Drupal's built-in SEO tools assist developers in optimizing their websites by generating search-engine-friendly URLs, meta tags, and sitemaps, and providing customizable page titles and descriptions.
  5. Active community support: Drupal benefits from a substantial and engaged group of developers who contribute to its ongoing evolution and offer support to fellow users. This community-driven approach ensures that any issues or bugs are addressed promptly, and new updates and features are released regularly.

Why Tailwind CSS is great for web developers

Tailwind CSS has gained immense popularity in the web development community due to its numerous benefits and advantages.

  1. Highly customizable: One of the biggest advantages of Tailwind CSS is its high level of customization. Unlike other popular front-end frameworks like Bootstrap or Foundation, Tailwind does not come with predefined styles and components. Instead, it provides a comprehensive set of utility classes that can be used to create custom designs according to your specific project needs. This allows developers to have complete control over their designs and enables them to create unique and personalized user experiences.
  2. Lightweight: Another major benefit of using Tailwind CSS is its lightweight nature. Since it only generates the required classes based on your design choices, it helps keep the file size small and improves website performance. This is especially beneficial for Drupal websites, which often have heavy content and functionality that can slow down page loading speed.
  3. Mobile-responsive: With more people accessing websites through mobile devices, having a mobile-responsive design has become crucial for any website's success. Tailwind CSS makes it easier to create responsive designs without adding extra code or media queries. Its mobile-first approach ensures that your website looks great on all screen sizes without compromising on performance.
  4. Faster development process: By eliminating the need to manually write complex CSS rules, Tailwind speeds up the development process significantly. It also reduces the chances of errors, as developers can easily reuse existing utility classes instead of creating new ones from scratch every time they need a similar style element.
  5. Scalable designs: With Tailwind CSS, you can create scalable designs that are easy to maintain and modify in the long run. As your website grows and evolves, making changes to its design becomes more manageable with Tailwind's utility classes, as opposed to traditional frameworks where modifying styles can be a tedious process.

Why Integrate Drupal with Tailwind CSS?

One of the main reasons to integrate Drupal with Tailwind CSS is its ability to simplify web development. With Tailwind CSS, developers can avoid writing repetitive and lengthy code for styling. Instead, they can use pre-built classes that provide consistent and reusable design patterns. This not only saves time but also results in cleaner code that is easier to maintain.

Another benefit of integrating Drupal with Tailwind CSS is its responsive design capabilities. With the increasing use of mobile devices, it has become essential for websites to be optimized for all screen sizes. Thanks to Tailwind's built-in responsive utilities, designers can easily build responsive layouts without having to write separate media queries for different devices.

Step-by-Step Guide to Integrating Drupal with Tailwind CSS

Step 1: Get your prerequisites ready

  • Local Drupal setup
  • NPM and Yarn package managers
  • Drush
  • Lando

Step 2: Create a custom theme

We will begin by creating a Drupal custom theme and then integrate Tailwind CSS into it. In this tutorial, I am using the starter kit theme command to generate my custom theme. Go to the web directory and type the following command in the terminal:

php core/scripts/drupal generate-theme drupal_tailwind_theme

After pressing the enter key, our custom theme will be generated.

Next, we will generate the package.json file, which we need to initialize a new Node.js project. Go to the terminal and type the following command:

npm init

It prompts us with a series of questions about the project; for this tutorial I keep everything at the defaults by pressing the enter key until the package.json file is added.

Step 3: Install and set up Tailwind CSS

Now let's install Tailwind and its dependencies and configure it. In the terminal, run the following command:

npm install -D tailwindcss postcss autoprefixer

Next, let's generate a Tailwind CSS config file using the following command:

npx tailwindcss init -p

Go to the tailwind.config.js file and set the paths for Tailwind to scan. Since our Twig templates are in the templates directory, we will use that path.

Now we need to create a CSS file to put the Tailwind directives in. Go to the src folder, create a file named index.css, and paste the following directives:

@tailwind base;
@tailwind components;
@tailwind utilities;

Create a folder where the Tailwind CSS will be compiled. On the same level as the src folder, create a folder named dist (or any name you prefer). We now run the following command:

npx tailwindcss --input src/index.css --output dist/index.css

After executing this command, we will be able to see the compiled version of the Tailwind CSS in the dist folder.

In the package.json file, we will add the scripts dev and watch, which can be executed using npm run dev and npm run watch (or yarn dev and yarn watch). The dev script compiles the code once per run, while the watch script keeps watching for any change.

Step 4: Adding Tailwind CSS to our Drupal theme

Now we will add Tailwind to our Drupal site. For that, we can add a library named tailwind and give it the CSS path dist/index.css, that is, the location of the compiled Tailwind output. Next, we can attach that library either globally or to specific templates. For the purpose of this tutorial, I will add it globally.

Using Tailwind CSS in our Drupal theme

For this tutorial, I am using the page.html.twig file and adding a red background color by including the class bg-red-500 in the markup. Clear the cache and check the output: our Tailwind CSS class has applied the desired background color.

Final Thoughts

Tailwind CSS is a great choice when you want to build websites quickly and easily. It integrates well with Drupal, and the combination of these two tools creates a powerful synergy for web development. Reach out to us if you would like to learn more about how Specbee's Drupal experts can help you create stunning web experiences with Drupal and Tailwind CSS.