
Spinning Code: Thoughts on Professional Development

Ongoing professional development is a fact of life for people who want a long-term career in technology. That means finding ways to learn new skills above and beyond what you’re paid to do. We all need to strike a balance between having a life outside of work and making sure we’re learning new work-related skills. Your professional development should not take over your personal life, but if you only learn on the job you will limit your options over time and be bound to the generosity of your employer.

I changed jobs recently. In the process of leaving my old job I was asked a lot of questions about how I’ve built up a wide array of skills and knowledge. During those discussions I tried to give the best advice I could, on the fly, to questions I was asked cold. Now I’ll try to assemble a more carefully considered set of advice.

These suggestions run more or less in increasing order of importance: the further you read, the more valuable I think each suggestion is.

Directed Learning

By “Directed Learning” I mean learning that is supported or encouraged by your employer. This can be certification preparation, employer-provided trainings, research into an emerging technology, or even building tools that directly support your work. Into this category goes anything that can lead to higher pay, advancement, or even just keeping your current job.

Some employers are really good about this. Over time I’ve been sent to trainings, had course fees covered, and received other direct support. The job I just left even provided financial incentives (cash) for us to acquire select certifications of value to the company. Other employers have published a list of what they wanted people to have, without direct incentive. But most often I’ve had to figure this out on my own, which is not ideal for anyone involved.

This type of growth has a few benefits. First, it can help with your reputation (and hopefully pay) with your employer: “Oh, Aaron is certified on [interesting product] we can assign him to [high profile project].” Second, it helps make sure you are growing in alignment with your employer’s needs. That can be helpful when they are forced to make cuts: “We should keep Aaron, he’s the only team member with [rare certification] and Salesforce wants us to have that on the team.”

In my life as a Drupal developer this looked like creating modules for internal use, or learning new techniques. But it boils down to the same thing: learning stuff your employer needs you to know.

The downside of only learning what your employer needs is that you miss chances to grow beyond their needs. That limits your future mobility – your best chances will be for another job like the one you have today. If you want a better job, or a job doing something different, you need skills your employer doesn’t think they care about.

Personal Learning: Side Projects and Other Self-Directed Research

To learn things your employer does not care about you will likely need to put in your own time. You should focus this kind of learning on something that interests you – make it fun. My experience is that this kind of learning goes best with side projects.

Side projects give us chances to test and practice the ideas we’re learning. I use them to learn programming languages, frameworks, and techniques that interest me but aren’t directly useful to my work. Sometimes they come around to being useful in my work. Sometimes they just give me a chance to learn interesting things. My GitHub account is full of examples.

An early professional mentor told me to learn a new programming language every year, and so over time I taught myself a bunch of programming languages. These days I generalize that advice a bit and try to learn a new platform, framework, or technique every year – and sometimes a new language. That meant when Salesforce rolled out LWCs to replace Aura Components, I was ready to learn them quickly. It also means when a friend started to talk about using Wails I could relate it to my experiences using Electron. My experience creating a project estimation tool helps me discuss estimation techniques with project teams.

This blog is an example of a side project. Every post gives me practice at writing. To keep posts accurate I include research as part of my writing process. That research teaches me things I wouldn’t learn otherwise. The act of writing forces me to clarify my thinking on topics. All of that is useful to me and helps me be better at my work. I also know this blog has been read by colleagues, employers, and potential employers, all of whom are useful audiences when looking for new work.

When I give talks about communication skills I tell people that what they do for practice doesn’t matter: what matters is that they practice. The same is true for side projects. What project you tackle isn’t important. What is important is that you do something that forces you to learn new things.

Side projects help us look at our work from different angles, and therefore test our limits and grow.

Develop Learning Routines

It’s easy to get lazy or to put your professional development aside – it feels like there will be time later. So you should create a routine that works in your life.

I like to have a routine to make sure I’m steadily doing some form of professional development – it’s a marathon not a sprint. For me that generally looks like job-related learning for about an hour after my formal work day and working on a side project on weekends. Attaching your learning to your work day allows you to more easily block the time in your mind and with family.

I know people who like to do their study before work because that’s when they are fresh and study gets their brain into the right gear for the day. Other friends tell me they like to do research or study after their kids go to bed because it gives them more time with the most important people in their lives. All of those are great patterns because they are tailored to their life.

Whatever routine works for your life is a good one. The important part is making learning a regular activity.

Share What You Know

One of the best ways I know to learn is to teach. Giving talks, writing blog posts, and mentoring all give me a chance to share what I know with other people.

Sharing can be intimidating. I have been writing this blog for nearly 10 years. I gave my first conference talk more than 10 years before that. And still, every time I take on a new topic I’m nervous I’ll be found out as a fraud.

I try to leverage the fear I feel to help me assemble the best content I can (at least within my time and resource constraints). When I am writing, or preparing a talk, I also do a lot of fact checking. Since I never want to put anything into the world that I can’t back up later, I do my best to make sure I’m right. In the process I also learn more about the topic. That even means correcting material I was preparing because my initial understanding was wrong.

On the flip side I try not to let the perfect be the enemy of the good. I require myself to post on my blog once a month – limiting my editing time. Similarly conference talks have a hard deadline. Those deadlines force me to complete content on a schedule, not when it’s perfect. So be it.

When you share what you know people start to see you as an expert. That can lead them to ask you questions that push you to learn more.

All that can lead you to be an expert.

Make Friends

Making friends is easier for some of us than others. And maintaining friendships in our current society is extremely hard. But friendships are super important. They help keep us healthy, mentally fit, and happy. Friends also teach us really important things about the world around us.

If you have read this blog much you are used to seeing me make references to my friends as sources of information. They are my greatest resource for learning (and a bunch of other things too). Friends know stuff we don’t know. Even better, they will tell you things you don’t know. Good friends don’t judge you for your gaps, but are happy to help you fill them.

I value the input and perspective that friends share with me. We don’t have to agree on everything. We don’t have to be interested in the same technologies. In fact both of those conditions help me understand things I wouldn’t know about otherwise.

I cite my friends in my writing because they are important. They deserve credit when they teach me things. Processing ideas with my friends helps me test them for quality. Often they point out errors in my thinking. Those are all good things.

Be Curious

This is one of the most important lessons I learned from my mom. She was a deeply curious person and that rubbed off on my sister and me. We both remember her dragging us to lectures at the RPI Freshwater Institute in Bolton’s Landing, NY (it’s fun to discover those still happen – I highly recommend them if you’re in the area). And the time she took us to a ranger talk at a National Park where the ranger offered a postcard to anyone who asked a question he couldn’t answer – she promptly earned all his postcards with questions she genuinely wanted answered. She would stop and talk to keepers at the Philadelphia Zoo to learn more about the animals (a habit my wife and I both now share).

Because of her constant interest in, well, just about everything, I have been to historic sites, factory tours, art and science museums, zoos, and many other places. My wife – a curious person herself, with her own curious parents – and I continue those kinds of adventures. We go places, we ask questions, and we learn things.

We also read widely, both in our professional fields, and about the world in general. Long drives often involve several hours of listening to podcasts featuring experts in a variety of fields. I’ve previously talked about those as my ongoing Liberal Arts education.

If you aren’t someone who was trained to be curious by your family, it turns out you can do things to increase your curiosity. And yes, I found that article because I was curious to know whether curiosity is trainable – there are deeper articles on curiosity out there too.

The value in being curious is that it causes you to learn things you might otherwise have missed. Some of those things just make you smarter about your world – and the world needs more smart people. Others will suddenly pop up in your work and make you better at your job. They might come up in random conversation with a new friend and deepen your connection to another person.

Learning new things is never a bad idea.

Always Learn More

All this advice boils down to this central idea: always be learning something new. It’s good for your brain and good for your life (oh yeah, and career).

Reading through all of this commentary on professional development you might think: “Wow, that takes a lot of time!” But if you put yourself into a mindset of always trying to learn something new, much of that time starts to overlap with other activities in life. And when it’s not all dedicated directly to work, it doesn’t burn you out the way constant study of one technology or platform can.

Push yourself forward. Find things that are interesting to you. Nothing new is a waste of your time.

Morpht: Supercharge your Drupal QA: Automating Cypress tests with GitHub actions

This guide explains how to automate your Cypress tests for a Drupal site using GitHub Actions. We’ll cover setting up a full CI environment, connecting a MySQL database, running the Drupal server, and executing end-to-end Cypress tests - all inside GitHub’s cloud runners. You'll learn practical tips to speed up builds with caching and securely handle environment configurations.

The Drop Times: Advanced Features of the Automated Testing Kit - Part 3

In part three of the Automated Testing Kit series, André Angelantoni outlines how teams can grow their automated test suite by shifting test-writing to developers and backfilling the backlog, integrate early accessibility checks with axe-core or Cypress, configure Google Lighthouse for pre-release performance audits, and implement visual regression testing using tools like Diffy.

Salsa Digital: Rules as Code for US cybersecurity

The US Executive Order

On 6 June 2025, amendments were made to Executive Order 14144 of 16 January 2025 (Strengthening and Promoting Innovation in the Nation’s Cybersecurity). It included an update to section 7: “Within 1 year of the date of this order, the Secretary of Commerce, acting through the Director of NIST; the Secretary of Homeland Security, acting through the Director of CISA; and the Director of OMB shall establish a pilot program of a rules-as-code approach for machine-readable versions of policy and guidance that OMB, NIST, and CISA publish and manage regarding cybersecurity.”

OMB, NIST and CISA

The Office of Management and Budget (OMB) is the government department that helps the US president execute policy objectives.

Nuvole: Implementing fast end-to-end tests on Drupal with Docker and JSON-RPC

A behind-the-scenes look at how we tackled the challenges of end-to-end testing in a Drupal and Next.js project, using Docker, JSON-RPC, and GitHub Actions to streamline the process.

"We'll add the tests later". Most of us have said this at some point in our software development career. But then "later" never arrives.

That is because implementing tests is hard. Setting up the test runners, writing the test cases, keeping them running: all of it requires extra effort. Even more effort is needed for end-to-end (E2E) tests, since they touch every part of an application. However, this effort always pays off. In this post, I will share how we implemented E2E tests on one of our Drupal projects so that you can save some of the effort needed to get started. We are very happy with how this approach turned out and we plan to use it on future projects too.

Meet ILO Live

ILO Live is the International Labour Organization (ILO)'s online video streaming platform. It hosts both live streams and recordings of events organized all across the world by the ILO.

Under the hood, ILO Live is powered by Next.js and a headless Drupal CMS. Most pages on the site are statically served by Next.js.

In this post, I will take you through how we implemented E2E tests on ILO Live.

The challenges of implementing E2E tests

A common approach when writing frontend tests is to use mocked API responses. But with ILO Live, we wanted to go a step further and use the actual CMS in the tests. By using the actual CMS, in addition to verifying that the site works as intended, we can also verify that changes to the CMS haven’t broken any functionality on the site. But to do this, we had to solve two major problems.

  1. How do we orchestrate the CMS into different testing states from the test runner?
  2. How do we quickly start the CMS during each test run?

Orchestrating the Drupal CMS using a private JSON-RPC API

The simplest way to orchestrate a Drupal CMS is to have a test runner click on various elements to create new entities and change their state. But this takes a lot of time to run, especially when multiple entities need to be created during each test run.

To solve this problem, we decided to create a private API using the JSON-RPC module. This API would only be enabled on development and testing instances of the CMS and it exposed several operations which the test runner could use to orchestrate the CMS into different states.

For example, the JSON-RPC API exposed a create.event method for creating new events.

{
  "jsonrpc": "2.0",
  "method": "create.event",
  "params": {
    "type": "meeting",
    "title": "Aragorn meets Gandalf",
    "start_date": "2021-06-01 8:00:00",
    "end_date": "2021-06-01 17:00:00"
  },
  "id": 1
}

And a clean.content method for resetting the CMS to the initial state with no content set.

{
  "jsonrpc": "2.0",
  "method": "clean.content",
  "params": {},
  "id": 1
}

We added similar methods for creating other entities on the site and for admin tasks such as reindexing the search index. ILO Live used AWS AppSync to receive real-time updates about the current state of events, so to simulate this in the tests we set up a testing instance of AppSync and implemented methods like the update.livestream method shown below to change the state of this instance.

{
  "jsonrpc": "2.0",
  "method": "update.livestream",
  "params": {
    "event_id": "1",
    "status": "live"
  },
  "id": 1
}
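For illustration, the HTTP request such a payload travels in can be sketched in plain Node.js (the base URL, endpoint path, and credentials below are placeholders, not the project's actual values):

```javascript
// Build a JSON-RPC 2.0 request for a private API like the one described above.
// The basic-auth credentials and the endpoint path are hypothetical.
function buildRpcRequest(baseUrl, user, password, method, params = {}, id = 1) {
  const authorization = Buffer.from(`${user}:${password}`).toString("base64");
  return {
    url: `${baseUrl}/path/to/jsonrpc`,
    options: {
      method: "POST",
      headers: {
        Accept: "application/json",
        "Content-Type": "application/json",
        Authorization: `Basic ${authorization}`,
      },
      body: JSON.stringify({ jsonrpc: "2.0", method, params, id }),
    },
  };
}

const req = buildRpcRequest("http://localhost", "admin", "admin", "clean.content");
// A test runner could then send it with: fetch(req.url, req.options)
```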

Speeding up the Drupal CMS's startup time using Docker

Now we had to get the Drupal CMS running while the test runner was running. We were familiar with using Cypress for implementing frontend tests, so we decided to use the same tool for the E2E tests. Since the E2E tests ran assertions on elements of the site's frontend, we decided to store the E2E tests within the frontend repo and use GitHub Actions to run them, since the frontend repo was hosted on GitHub.

We used GNU Make and Docker Compose during development to run the CMS and its services (i.e. MariaDB and Redis), so initially we tried to clone the CMS and run the same commands to start it in CI. Docker is preinstalled by default on the ubuntu-latest runner on GitHub Actions, so we were able to easily use it there.

# Simplified Github actions workflow for running the E2E tests
name: Tests
on: push
jobs:
  cypress:
    name: Run Cypress tests
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      # Clone the CMS into a subfolder
      - name: Clone CMS
        uses: actions/checkout@v4
        with:
          repository: github-org/ilo-live-cms
          path: ./cms
          # actions/checkout only has access to the current repo by default
          # A custom token needs to be provided to it for it to access a different repo
          # https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens
          token: ${{ secrets.CMS_ACCESS_TOKEN }}
          ref: develop
      - name: Start CMS
        run: |
          cd cms
          make
      # More steps to actually run the tests

# Simplified Makefile from the CMS
default: build install

up:
	@echo "Starting up containers for ilo_live..."
	docker compose up -d --remove-orphans

build: up
	@echo "Building ilo_live project development environment."
	docker compose exec -T php bash -c "composer install"

install:
	docker compose exec -T php bash -c "vendor/bin/drush si -y --existing-config --account-pass=\"admin\""
	docker compose exec -T php bash -c "vendor/bin/drush deploy"

# Simplified docker-compose.yml for the CMS and its services
services:
  mariadb:
    image: mariadb-image
  php:
    image: php-image
    volumes:
      - ./:/var/www/html
  redis:
    image: redis-image

This setup worked, but it took around 2–4 minutes to run. Most of this time was spent on Composer and the Drupal site install.

This delayed each test run so we started looking into ways to improve it. The solution that we came up with was to create a self-contained Docker image for the Drupal CMS. This way, the test runner only needed to pull and run a single image to start the CMS.

To do this, we set up a GitHub Actions workflow on the CMS repo to build and push a development Docker image. This image had the JSON-RPC API enabled and used a SQLite database instead of MariaDB.

# Simplified Dockerfile for building the development docker image
FROM drupal-image

# Replace the site files in the image with our own
RUN rm -rf /opt/drupal/*
COPY ./ /opt/drupal/

# Install SQLite and dependencies required by composer
RUN apt update && apt install -y sqlite3 git zip

WORKDIR /opt/drupal
RUN composer install

# Use custom settings for the development image
RUN cp docker/settings.bundle.php web/sites/default/settings.local.php

RUN vendor/bin/drush site:install -y --existing-config --account-pass=admin
RUN vendor/bin/drush deploy

// Simplified settings.bundle.php for the development image
<?php

// Enable the config split for the JSON-RPC API for the development image
$config['config_split.config_split.jsonrpc']['status'] = TRUE;

// Use SQLite instead of MariaDB
$databases['default']['default'] = array(
  'database' => 'sites/default/files/.ht.sqlite',
  'prefix' => '',
  'driver' => 'sqlite',
  'namespace' => 'Drupal\\sqlite\\Driver\\Database\\sqlite',
  'autoload' => 'core/modules/sqlite/src/Driver/Database/sqlite/',
);

We used the Configuration Split module by our own Fabian Bircher to ensure that the JSON-RPC module is only enabled during development.

Here is the GitHub Actions workflow we used to build and publish the image to the GitHub Container Registry.

# Simplified Github actions workflow for building and publishing the development docker image
name: Build and push development docker image
on:
  push:
    # Usually the E2E tests can use the image built from the develop branch of the repo
    # But to test upcoming changes, we can prefix a branch with docker- to have this workflow build an image for it
    branches: ['develop', 'docker-*']
jobs:
  build-and-push-image:
    runs-on: ubuntu-latest
    # Grant the default `GITHUB_TOKEN` permission to read the current repo and push images to the Github container registry
    permissions:
      contents: read
      packages: write
    steps:
      - name: Log in to the Container registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      # This action extracts the tags and labels that should be set on the published image
      # It exposes them as an output which is consumed by the next step through steps.meta
      - name: Extract metadata (tags, labels) for Docker
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ghcr.io/github-org/ilo-live-cms
      # This action uses the git repo as the build context by default
      - name: Build and push Docker image
        uses: docker/build-push-action@v6
        with:
          push: true
          file: docker/Dockerfile
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}

This is the updated workflow on the frontend repo for using the development image.

# Simplified Github actions workflow for running the E2E tests using the development docker image
name: Tests
on: push
jobs:
  cypress:
    name: Run Cypress tests
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Log in to the Container registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.ACCESS_TOKEN }}
      - name: Run CMS container
        run: |
          docker run \
            --detach \
            -p 80:80 \
            ghcr.io/github-org/ilo-live-cms:develop

With this setup, the development image is built within about 5 minutes each time a pull request is merged into the develop branch of the CMS, and the E2E tests workflow can start the CMS in less than a minute.

An added benefit of this approach is that it was much simpler to run the CMS. Previously, the only way to run the CMS was to clone the CMS repo and set up the project locally alongside the database server. Now it just needed a single command.

docker run --detach -p 80:80 ghcr.io/github-org/ilo-live-cms:develop

Putting everything together

To use the JSON-RPC API more conveniently within the Cypress tests, we defined several Custom Commands.

const getRPCPayload = (method, params = {}) => {
  // Get basic authorization header.
  const authorization = Buffer.from(
    [Cypress.env("CMS_USER"), Cypress.env("CMS_PASSWORD")].join(":")
  ).toString("base64");

  return {
    method: "POST",
    url: `${Cypress.env("CMS_URL")}/path/to/jsonrpc`,
    headers: {
      Accept: "application/json",
      Authorization: `Basic ${authorization}`,
    },
    body: {
      jsonrpc: "2.0",
      method,
      params,
      id: Math.floor(Math.random() * 1000),
    },
  };
};

Cypress.Commands.add("rpc", (method, params = {}) => {
  cy.request(getRPCPayload(method, params)).then((response) => {
    // Run assertions on the response from the CMS to ensure that the call ran successfully
    expect(response.status).to.eq(200);
    expect(response.body).to.have.property("result");
    return response.body.result;
  });
});

// Cypress commands can call other commands and build on top of each other
// So we created several utility functions to reduce repeated logic
Cypress.Commands.add(
  "createPastEvent",
  ({ id = 1, type = "meeting", startDate, startHour = 8 } = {}) => {
    const event = {
      type,
      title: `Past ${type} ${id}`,
      // Not the most robust logic but sufficient for testing
      start_date: `2023-01-${startDate || id} ${startHour}:00:00`,
      end_date: `2023-01-${startDate || id} ${startHour + 1}:00:00`,
    };
    cy.rpc("create.event", event);
  }
);

// Command that should be run before each test run
Cypress.Commands.add("setup", () => {
  // Cypress runs each test in isolation so that tests can't interfere with each other
  // To enforce that isolation across the entire system, we use this command to reset the CMS to its initial state
  cy.rpc("clean.content");
  // Enable preview mode on Next.js to make it regenerate pages during each request
  // https://nextjs.org/docs/pages/guides/preview-mode
  cy.request("/path/to/enable-preview");
});

These custom commands made it much easier to orchestrate the CMS as required.

describe("Event page", () => {
  beforeEach(() => {
    cy.setup();
  });

  it("should display past meetings correctly", () => {
    // Call the JSON-RPC API to set up the content
    cy.createPastEvent().then((res) => {
      // Visit the page corresponding to the newly created content
      cy.visit(res.path);
      // Check if the elements on the page are rendered correctly
      cy.get("h1").should("contain", "Past meeting 1");
      cy.contains("1 January 2023");
      cy.contains("08:00 - 09:00 GMT+1");
    });
  });
});

Final results

Ultimately we implemented over 50 end-to-end tests for this project. The final test suite takes 7–8 minutes to run on a standard GitHub-hosted runner (i.e. not a large runner). This runtime is acceptable for us for now, but it could be improved even further by using parallelization.

While these tests took us some effort to implement, they have definitely paid off in the long term. ILO Live's live event functionality was mission-critical but very hard to orchestrate manually; with the E2E tests, we were able to ensure that it always worked. These E2E tests gave us the confidence to make big improvements to the codebase, since we knew that bugs in key functionality would be revealed instantly – and that is the biggest benefit of them all.

Written by Prabashwara Seneviratne, frontend developer at Nuvole and author of Frontend undefined.

Tags: Drupal Planet, Next.js, Frontend, Test Driven Development

PreviousNext: Experience Builder has pushed the boundaries of what Drupal can do. Here's how we can push it further.

You've probably seen some excitement in the Drupal community around Experience Builder. The current version (0.5.0-alpha1) shows a giant leap forward in Drupal's page building and editing experience.

by lee.rowlands / 8 July 2025

Since late 2024, I've been engaged by Acquia to work on Experience Builder. This has been a particularly challenging project, combining many core Drupal concepts, such as the Form, Render, Entity, Plugin and Validation APIs, with newer front-end technologies like React, Redux, RTK-Query, Vite and Astro.

As we approach a beta release, I've been thinking about how we could push Experience Builder even further. Here are some of the things I'd love to work on next.

True multi-user editing

Currently, Experience Builder makes use of hashing to detect editing conflicts. If two users are editing the same page and one of them doesn't have the latest changes, the back end will prevent a user from overwriting another user's changes. This is similar to the 'The content has either been modified by another user, or you have already submitted modifications' error message we've had on content edit forms for a long time.

Unfortunately, this doesn't align with what users expect in 2025. Users are accustomed to multi-user editing experiences, such as Google Docs. Experience Builder has gone to great lengths to focus on building a modern front-end editing experience with React, so getting a 'reload to fetch latest changes' message detracts from that.

Luckily, we're well-positioned to support a multi-user workflow.

In Experience Builder's front-end, when you make a change to the page, we make use of Redux to keep track of the layout and model. Each change triggers a 'reducer' that takes the old state and returns a modified version of it.

Currently, we perform those changes on the front-end and then make either a PATCH or POST request to Drupal to update the preview and store the draft version. When you edit a single component, we make a PATCH request and update only that component.

When you perform any other operation, such as re-ordering a component, deleting a component, or adding a new component, we perform a POST request and update the entire model. Technically, in REST parlance, this is a PUT, but let's not split hairs.
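A minimal sketch of that reducer pattern in plain JavaScript (illustrative only; the action names and state shape here are not Experience Builder's):

```javascript
// Each operation on the layout is a discrete action handled by a reducer
// that takes the old state and returns a new one, never mutating in place.
const initialState = { components: [] };

function layoutReducer(state = initialState, action) {
  switch (action.type) {
    case "ADD_COMPONENT":
      return { ...state, components: [...state.components, action.component] };
    case "DELETE_COMPONENT":
      return {
        ...state,
        components: state.components.filter((c) => c !== action.component),
      };
    default:
      return state;
  }
}

// Because every change is a discrete operation, the same action log can be
// replayed or merged, which is what collaborative editing algorithms build on.
let state = layoutReducer(undefined, { type: "ADD_COMPONENT", component: "hero" });
state = layoutReducer(state, { type: "ADD_COMPONENT", component: "cta" });
state = layoutReducer(state, { type: "DELETE_COMPONENT", component: "hero" });
```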

True collaborative editing applications typically use a collaborative editing algorithm. There are two widely used algorithms and a third emerging one.

Google Docs uses the OT algorithm. It has low memory and storage requirements, but is slow to perform merges. CRDTs are more efficient when it comes to merging, but have higher data storage and memory requirements. EG Walker is an emerging algorithm that combines the best of OT and CRDT, aiming to reduce the data and memory overhead of CRDTs without losing the merge speed.
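To make the CRDT side of that comparison concrete, here is a grow-only counter, one of the simplest CRDTs (purely illustrative; nothing here is Experience Builder code):

```javascript
// A grow-only counter CRDT: each replica increments only its own slot, and a
// merge takes the per-replica maximum, so merges are commutative and
// idempotent regardless of the order in which updates arrive.
class GCounter {
  constructor(id) {
    this.id = id;
    this.counts = {};
  }
  increment() {
    this.counts[this.id] = (this.counts[this.id] || 0) + 1;
  }
  merge(other) {
    for (const [id, n] of Object.entries(other.counts)) {
      this.counts[id] = Math.max(this.counts[id] || 0, n);
    }
  }
  value() {
    return Object.values(this.counts).reduce((a, b) => a + b, 0);
  }
}

// Two replicas increment concurrently, then sync in either order
// and converge on the same value.
const a = new GCounter("a");
const b = new GCounter("b");
a.increment();
a.increment();
b.increment();
a.merge(b);
b.merge(a);
```

The higher storage cost mentioned above shows up directly: the counter keeps one slot per replica forever, rather than a single integer.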

Because we use reducers to make changes to the front-end, we're already thinking of updates to the model as discrete operations. This should lend itself well to adopting a true collaborative editing algorithm that will allow us to achieve the multi-user editing experience users expect in 2025.

Real-time server-side events

Currently, the front-end polls the back-end every 10 seconds to update the list of pending changes. This is because we don't have a way of initiating updates from the server side.

Mercure is a lightweight real-time event server that uses Server Sent Events. Mercure was created by Kévin Dunglas from the Symfony Core team and easily integrates with PHP (there's even a module for it).

Adding real-time events to Experience Builder unlocks a huge number of usability improvements. Not only can we initiate updates from collaborative editing in real-time, we can also get really creative with the user experience and add features like:

  • Multi-user presence notifications - e.g. we could show the Avatar of users editing a page in the top bar, which is what Google Docs does
  • Highlight components another editor is editing - we could add a visual indicator to a component when another user is editing it

Obviously, we would need to take a progressive enhancement approach to this. We can't make Mercure a hard requirement for hosting a Drupal site. We could add these features when Mercure is available and configured, and silently ignore them when it isn’t.
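As a sketch of what the subscription side could look like, here is how a client might build a Mercure subscription URL and listen for pushed events. The hub URL and topic are hypothetical, and the EventSource portion assumes a browser environment:

```javascript
// Mercure subscribers pass one or more `topic` query parameters to the hub URL.
function buildMercureUrl(hub, topics) {
  const url = new URL(hub);
  for (const topic of topics) {
    url.searchParams.append("topic", topic);
  }
  return url.toString();
}

const url = buildMercureUrl("https://example.com/.well-known/mercure", [
  "/xb/pending-changes",
]);

// In the browser, the front-end would then listen for pushed updates
// instead of polling every 10 seconds:
//
//   const source = new EventSource(url);
//   source.onmessage = (event) => {
//     const update = JSON.parse(event.data);
//     // ...dispatch a Redux action to refresh the pending changes list
//   };
```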

Client-side preview updates

At present, Experience Builder does a round-trip back to Drupal to update the preview. This is because for SDC and block components, we need Twig to render the preview. Ideally, we could generate a preview from the user's browser and avoid this latency.

There are two ways we could tackle this.

The first approach is to ship a WASM build of Twig. The Twig playground is an in-browser version of Twig that lets you write and execute Twig code. It utilises a WASM build of PHP running Twig. If we took this approach, we could likely achieve a feature set very close to that of running Twig in Drupal. Obviously, there are some things we can't do. We would be limited to pure Twig templates that take input and produce output. Modules like Twig Tweak, which let your Twig templates interact with Drupal, couldn't be supported. Luckily, most of the things we're rendering in Experience Builder should largely be pure, such as SDCs and Blocks.

The second approach is a bit more experimental. In my spare time, I've been working on a library called twigCASTer. It takes the token stream parsed from Twig and turns it into a Component Abstract Syntax Tree. The Twig parser is only interested in tokens that it needs to render the output. It takes no position on the hierarchy of HTML. Markup in Twig files is seen as a string, with no context of the DOM tree it may represent. twigCASTer extends Twig tokens to build up a DOM tree whilst parsing the string content. From there, it can be cast into another component language. I've been experimenting with taking this AST and casting it into valid compiled JSX. 

It’s still very early days, but I have a number of test cases passing where Twig goes in and valid JSX, ready for consumption in the browser, comes out. This is a much more ambitious approach and will be tricky to get right. An obvious shortcoming is the many Drupal Twig extensions that have no JSX equivalent.
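To make the idea concrete, here is a deliberately tiny toy transform — not twigCASTer itself, which works on the real parsed token stream and builds a DOM tree — that only rewrites simple `{{ var }}` interpolations into JSX expressions. It shows the input/output shape the library is aiming for, nothing more:

```javascript
// Toy illustration of the Twig-to-JSX direction (NOT twigCASTer's actual
// implementation): rewrite `{{ var }}` interpolations as JSX expressions.
function twigToJsxSketch(twigSource) {
  // {{ label }} → {label}; dotted accessors like item.title pass through.
  return twigSource.replace(/\{\{\s*([\w.]+)\s*\}\}/g, (_m, expr) => `{${expr}}`);
}

// Example:
// twigToJsxSketch('<h2 class="title">{{ label }}</h2>')
//   → '<h2 class="title">{label}</h2>'
```

A real transform has to handle control structures, filters, and attribute objects, which is exactly where the component AST earns its keep.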

Summing up

Are you interested in working on any of these features? Do you have ideas or experience with the technologies mentioned? Reach out in the Experience Builder channel. Interested in sponsoring work on these features? Get in touch with us to express your interest.

Talking Drupal: Talking Drupal #510 - Drupal Hooks: Drop 'em like they're hot

Today we are talking about Drupal Hooks, why they got changed in core, and what to do now, with guest Károly Négyesi, better known as chx. We’ll also cover Media Folders as our module of the week.

For show notes visit: https://www.talkingDrupal.com/510

Topics
  • Deep Dive into Drupal Hooks
  • The Evolution of Drupal Hooks
  • Challenges and Solutions in Hook Conversion
  • Community Involvement and Contributions
  • The Future of Drupal Hook System
  • Introduction to Procedural Hooks
  • Understanding Theme Hooks
  • Complexities of Preprocess Hooks
  • Converting Hooks to Object-Oriented
  • Impact on Contributed Modules
  • Challenges in Core Conversion
  • Future of Drupal Hooks
  • Lightning Round and Conclusion
Resources

Guests

Károly Negyesi - ghost-of-drupal-past

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Martin Anderson-Clutz - mandclu.com mandclu

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

  • Brief description:
    • Have you ever wanted to have your Drupal site's media assets presented in a UI that evokes the hierarchy of a filesystem? There's a module for that.
  • Module name/project name:
    • Media Folders
  • Brief history
    • How old: created in Apr 2025 by João Mauricio (jmauricio)
    • Versions available: 1.0.3, which supports Drupal 10.3 and 11
  • Maintainership
    • Actively maintained
    • Security coverage
    • Test coverage
    • Number of open issues: 9 open issues, 2 of which are bugs, although one was just fixed
  • Usage stats:
    • 61 sites
  • Module features and usage
    • The module mimics a file structure by associating media entities with a taxonomy hierarchy
    • It then provides an intuitive, drag-and-drop UI to move items between locations, drag in new items, or even search within a particular “folder”, including a recursive search
    • When you drag in files, it uses “smart” logic to automatically assign files to Media bundles
    • It provides a form display widget, a view display widget, a CKEditor plugin, and it’s compatible with other filesystem modules, like S3 File System
    • This kind of interface is a requirement I’ve seen in RFPs from companies looking for a new CMS, so having it available as a drop-in solution is a real asset

The Drop Times: People Behind the Projects

Every part of Drupal, from modules to core releases to translations, is built and maintained by people. Thousands of contributors shape this ecosystem: writing code, reviewing issues, reporting bugs, maintaining projects, improving accessibility, updating documentation, mentoring others, and supporting discussions. Much of this work happens quietly, without being seen beyond issue queues, commit logs, or Slack threads. Yet this invisible layer is what keeps Drupal alive and moving forward.

Drupal has always valued community, and it already has a strong culture of contribution. The credit system, contribution recognition on Drupal.org profiles, and core commit messages all reflect that. But there’s room to show more. Not just who contributed to the core, but who maintains projects that thousands of sites depend on. Who keeps the modules stable? Who takes the time to help others in forums? Who steps in when no one else does? These are not side efforts; they are part of what makes Drupal reliable.

Recognising this work more clearly, whether through module pages, community spotlights, contributor highlights in newsletters, or dashboards that show project health, can strengthen the whole ecosystem. It supports trust. It helps people build reputations. It encourages sustained involvement. It reminds everyone that Drupal is not just made of code—it’s made of people who care, and continue to show up. As the project evolves, creating more ways to acknowledge and celebrate that effort is a simple, powerful step forward.

DISCOVER DRUPAL

ORGANIZATION NEWS

EVENT

We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now.

To get timely updates, follow us on LinkedIn, Twitter and Facebook. You can also join us on Drupal Slack at #thedroptimes.

Thank you, 
Sincerely 
Kazima Abbas
Sub-editor, The DropTimes.

DrupalCon News & Updates: DrupalCon North America 2026: Evolving for the Community

DrupalCon has always been a conference by the community, for the community—and as we look ahead to DrupalCon North America 2026 in Chicago, we’re making thoughtful changes to ensure it continues to reflect those values.

After a successful DrupalCon Atlanta, we’ve taken time to reflect, gather feedback, and make updates that prioritize access, sustainability, and community connection.  Each of the changes outlined below is rooted in one or more of these values—whether it's improving affordability, building lasting relationships, or creating a more efficient and inclusive event experience. With guidance from the DrupalCon North America Steering Committee, we’re excited to share a refreshed ticket structure, updated volunteer policies, a reimagined Expo Hall, and a renewed focus on summits, trainings, and collaboration.

What’s New for 2026

Ticket Pricing: More Affordable, More Accessible

We’ve simplified and lowered the cost of general admission tickets to make DrupalCon more accessible—without sacrificing the quality of experience our community expects. These changes were driven by feedback from past DrupalCon attendees, the North American Steering Committee, and the community at large, all of whom expressed a strong desire for more affordable access to the event.

Ticket Tier    Atlanta 2025   Chicago 2026   Savings
Early Bird     $890           $575           $315
Regular        $990           $700           $290
Late/Onsite    $1,190         $850           $340

Early Bird registration opens 15 September 2025 and runs for 16 weeks!
Secure your ticket early to lock in the best rate.

Camp & Local Association Ticket Perks

For every 5 tickets purchased from a Drupal camp or local association, that community will receive 1 complimentary ticket to share with a deserving community member, with a max of 10 complimentary tickets per local camp or association. It's our way of reinvesting in local leadership and participation.

Updated Volunteer Ticket Policy

This change reflects our focus on access and sustainability. In our DrupalCon Atlanta recap blog, we highlighted how streamlined operations improved the event experience for attendees and volunteers alike. Building on that momentum, we recognized the need for clearer guidelines to ensure volunteer opportunities are distributed fairly and effectively.

We’ve updated the volunteer ticket structure to make it more equitable and scalable:

  • Volunteer under 20 hours → 25% discount
  • Volunteer 20+ hours → Complimentary ticket

These tickets are non-transferable and may not be combined with other discounts.

Previously, volunteer ticket codes were sometimes misused or distributed without proper oversight. These updated guidelines help preserve full complimentary tickets for those who contribute a significant amount of time and effort, while also creating new opportunities for others to attend at a reduced rate.

Additionally, we’ve streamlined the on-site registration process with self-check-in, reducing the need for a large number of on-site volunteers and allowing us to focus support where it’s most impactful.

Learn more and sign up to volunteer.

Summits & Trainings: Real Talk, Real Skills

Summits are one of DrupalCon’s most valuable opportunities for industry-specific collaboration and knowledge sharing. Designed to connect attendees working in the same verticals, these events offer focused access to speakers with real-world experience, engaging roundtable discussions with peers in similar roles, and meaningful conversations about shared challenges. Attendees walk away with practical takeaways and lasting connections, while participating sponsors have a chance to introduce themselves to leaders in the space in an organic, relevant way.

Taking place Monday, 23 March 2026.

Industry & Community Summits

Join peers in:

  • Healthcare
  • Higher Education
  • Government
  • Nonprofit
  • Community

Each summit features two half-day sessions that do not conflict with the main conference program, creating space for meaningful discussion and idea sharing.

Summit Type        Atlanta 2025   Chicago 2026
Industry Summit    $250           $300
Community Summit   Free           Free for Ripple Maker members; $50 for non-members

(Click HERE to become a Ripple Maker)

Lunch is not included with the Community Summit, but a lunch ticket add-on will be available for purchase during registration.

Trainings

DrupalCon Trainings remain at $500 and offer deep-dive, expert-led learning opportunities on a wide range of Drupal skills.

More Community Updates

You’ll notice more networking spaces and informal meeting zones—especially in the Expo Hall and hallways. We’re doubling down on meaningful, unstructured connections.

These changes are only possible through thoughtful cost management and the continued support of our sponsors. Their partnership helps us keep ticket prices accessible while delivering the high-quality experience the community expects. We’re grateful to those who invest in DrupalCon and help us create an event that welcomes and supports everyone.

Traveling from Outside the U.S.?

The Drupal Association is happy to issue official invitation letters for those requiring a visa.

Request your visa letter here.

Letters are generated automatically—just complete the form and check your email (including spam folders).

Key Dates

Milestone                                   Date
Program at a Glance Released                6 June 2025
Call for Speakers Opens                     21 July 2025
Early Bird Registration Opens               15 September 2025
Call for Speakers Closes                    26 September 2025
Grants & Scholarships Applications Open     1 October 2025
Grants & Scholarships Applications Close    31 October 2025
Session Notifications to Speakers           12 November 2025
Grant & Scholarship Recipients Announced    12 November 2025
Regular Registration Opens                  5 January 2026
Conference Schedule Available               13 January 2026
Late Registration Opens                     23 February 2026
DrupalCon Chicago                           23-26 March 2026

Stay at the Heart of the Action

Hilton Chicago is DrupalCon’s official headquarters hotel—and it's where the magic happens.

From morning coffee chats to late-night strategy sessions in the lobby, this is where the community connects. Staying on-site helps you maximize your time, make spontaneous connections, and be part of the full experience.

Book your room at the Hilton Chicago.

Sponsorship Updates

We’re reimagining our sponsorship offerings to better connect you with the Drupal community—bringing fresh opportunities and updated packages designed for greater visibility, value, and impact.

Want to be the first to know when they go live? Email partnerships@association.drupal.org and we’ll make sure you're on the list.

Let’s Build What’s Next—Together

DrupalCon is more than just a conference—it’s the beating heart of our community. These changes help us keep that heart strong, inclusive, and accessible.

We can’t wait to see you in Chicago, 23-26 March 2026!