Drupal blog: The evolution of Drupal's composability: from the command line to the browser

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Discover Drupal's future: how Automatic Updates and Project Browser will empower ambitious site builders with a no-code composable platform and modern development practices.

Image removed.

Drupal's modularity allows developers to combine and reuse modules, themes, and libraries to create custom solutions. This modularity is one of the key ingredients that makes Drupal a composable platform. The original motivation behind Drupal's modularity was to accelerate the pace of innovation and democratize the experience of site building.

This blog post has two main goals.

First, we'll explore how Drupal's composability is evolving to empower ambitious site builders with modern, no-code development practices. Through exciting initiatives like Automatic Updates and Project Browser, Drupal will simplify the task of installing, composing, and updating Drupal sites, all within the Drupal user interface.

Second, we'll provide a retrospective on the past 10+ years of decisions that have led to significant changes in how end-users install, extend, develop, and maintain Drupal sites. By delving into Drupal's innovation process through a timeline approach, we'll showcase key contributors, significant milestones, and pivotal shifts in thinking that have influenced Drupal's approach to composability.

Let's start!

2011

Drupal 7 was released and introduced the "Update Manager". Derek Wright (3281d Consulting), Jacob Singh (Acquia), and Joshua Rogers (Acquia) had begun developing the feature in 2009.

The Update Manager can be considered Drupal's first no-code update system. It gave users an easy way to download modules from Drupal.org and upload them to their Drupal site.

Under the hood, the Update Manager uses either the File Transfer Protocol (FTP) or Secure Shell Protocol (SSH). An end user can upload a module to their Drupal site through a form, and Drupal will FTP or SSH the module to the web server's file system.

Interestingly, nearly fifteen years after its development started, Drupal 10 still uses the same basic Update Manager. However, this is about to change.

The Update Manager has several drawbacks: modules can conflict with each other, updates are applied directly to your live site, and if something goes wrong, there is no way to recover.

Image removed.
Sam Boyer and me creating the Drupal 8 branch on stage at DrupalCon Chicago.

In March 2011, we started working on Drupal 8, and later that year, in August, we agreed to adopt components from the Symfony project. This decision was made to help reduce the amount of code we had to build and maintain ourselves.

2012

The Symfony project was using Composer. Composer is a PHP package management system similar to npm. With Composer, developers can define the dependencies required by their PHP application in a file called composer.json. Then, Composer will automatically download and install the required components and their dependencies.

At first, we added Symfony components directly to Drupal Core's Git repository. Core Committers would regularly run composer update and commit their updated code to Drupal Core's Git repository. This left the end user experience relatively unchanged.

Some people in the Drupal community had concerns with storing third-party dependencies in Drupal Core's Git repository. To address this, we moved the Symfony components out of Git, and required Drupal's end users to download and install third-party components themselves. To do so, end users need to run composer install on the command line.

This approach is still used today. Drupal Core Committers maintain composer.json and composer.lock in Git to specify the components that need to be installed, and end users run composer install to download and install the specified components on their system.
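
To make that workflow concrete, here is a hedged sketch: a fragment of what a project's composer.json might declare (the package names and versions are purely illustrative), followed by the command end users run to install exactly what composer.lock specifies.

{
  "require": {
    "symfony/routing": "^6.3",
    "symfony/http-foundation": "^6.3"
  }
}

# Download and install the exact dependency versions recorded in composer.lock.
composer install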

Looking back, it is easy to see how embracing both Symfony components and Composer was a defining moment for Drupal. It made Drupal more powerful, more flexible, and more modular. It also helped us focus. But as will become clear in the remainder of this blog post, it also changed how end users install and manage Drupal. While it brought benefits, there were also drawbacks: it increased the maintenance, integration, and testing workload for end users. For many, it made Drupal more complex and challenging to maintain.

2013

We decided that Drupal Core would adopt semantic versioning. This marked a massive shift in Drupal's innovation model, moving away from long and unpredictable release cycles that broke backward compatibility between major releases.

To understand why this decision was important for Automatic Updates and Project Browser – and Drupal's composability more broadly – it's worth discussing semantic versioning some more.

Semantic versioning is a widely-used versioning system for software that follows a standard format. The format is X.Y.Z, where X represents the major version, Y the minor version, and Z the patch version.

When a new version is released, semantic versioning requires that the version number is updated in a predictable way. Specifically, you increment Z when a release only introduces backward compatible bug fixes. If new features are added in a backward compatible manner, you increment Y. And you increment X when you introduce changes that break the existing APIs.

This versioning system makes it easy to know when an automatic update is safe. For example, if a Drupal site is running version 10.0.2 and a security update is released as version 10.0.3, it's safe to automatically update to version 10.0.3. But if a major release is made as version 11.0.0, the site owner will need to manually update, as it likely contains changes that aren't compatible with their current version. In other words, the introduction of semantic versioning laid the groundwork for safe, easy Drupal updates.

2015

Drupal 8 was released. It came with big changes on all the fronts mentioned above: a shift towards object-oriented programming, support for Composer, the introduction of Symfony components, semantic versioning, and an unwavering commitment to simplifying upgrades for users.

Unfortunately, the reaction to Composer was mixed. Many Drupal contributors greatly appreciated the introduction of Composer, as it made it easier to share and utilize code with others. On the other hand, site owners often found it difficult to use Composer. Composer necessitates using the command line, something typically used by more advanced technical users. Moreover, unexpected failures during a Composer update can be complex to resolve for both developers and non-developers alike.

2016

The Drupal Association's engineering team, together with members of the community, launched the "Composer Façade". This meant that all Drupal.org hosted projects automatically became available as packages that could be installed by Composer.

There was some behind-the-scenes magic going on to help the Drupal community transition to Composer. For example, Drupal.org extensions were available to Composer even though they were not using semantic versioning.

Over the coming months and years, additional features would be added to Composer Façade, including solutions to help manage compatibility issues, sub-modules, and namespace collisions.

2017

Because Drupal has users with different levels of technical sophistication and different technical environments, we supported multiple distribution methods: zip files, tarballs, and Composer.

In the end, we were living in an increasingly Composer-centric world and updates via zip files or tarballs became less and less viable. So we agreed that we had to take a difficult path by fully embracing Composer. We began a long-running effort to make Composer easier for Drupal end users.

For example, the Drupal Association engineering team started building zip files and tarballs with Composer support: you could start with a zip/tar file, and then continue updating your site using Composer.

Separately, we also introduced new ways to install Drupal Core via Composer, such as using a new drupal/core-recommended project template. This template specifies the exact dependencies used to test a particular version of Drupal Core. Drupal Core is only released when all tests pass, so using drupal/core-recommended helps to prevent any problems caused by using different versions of the dependencies.
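
In practice, new sites typically start from the drupal/recommended-project template, which pins Drupal Core's dependencies through drupal/core-recommended. A hypothetical example (the directory name is arbitrary):

# Create a new Drupal site from the recommended project template.
composer create-project drupal/recommended-project my_site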

Image removed.
A slide from my keynote at DrupalCon Vienna 2017 where I introduced the Automatic Updates initiative. My speaker notes read: "Maybe Composer can be used under the hood to develop an automatic updates feature?".

Lastly, in my DrupalCon Vienna keynote, I declared the need for automatic updates, and made it a top priority for the Drupal community based on community surveys and interviews. This led to the formation of the Automatic Updates Initiative. The basic idea was to make updating Drupal sites easier by making Composer invisible to most users, thus empowering more people, regardless of their technical expertise.

2018

From 2017 into 2018, David Strauss (Pantheon) and Peter Wolanin (SciShield) took the lead on planning out the Automatic Updates Initiative, and presented possible architectural approaches at DrupalCon Nashville.

Their approaches drew inspiration from a multitude of Open Source projects like CoreOS, Fedora Atomic/Silverblue, and systemd. Some of the ideas outlined in their presentation have since been implemented. This is the beauty of Open Source; you can stand on the shoulders of other Open Source projects.

In 2018, the Drupal Security Team and Drupal Core release managers also extended the security coverage of Drupal minor releases from six to twelve months. This enabled site owners to update Drupal on their own schedule, but also introduced "security updates only" branches which will make automatic updates safer. This work was implemented with help from Ted Bowman, Emilie Nouveau, xjm, and Neil Drumm, with sponsorship from Acquia and the Drupal Association.

Later that year, at the Midwest Developer Summit organized by Michael Hess (University of Michigan), the new initiative team (composed of members of the Drupal Security Team, Drupal Association staff, and other interested contributors) defined a full initiative roadmap and began development. Key contributors were Angela Byron, David Strauss, Michael Hess, Mike Baynton, Neil Drumm, Peter Wolanin, Ryan Aslett, Tim Lehnen and xjm (sponsored by Acquia, Pantheon, the Drupal Association, SciShield, and the Universities of Michigan and Minnesota).

This work continued at Drupal Europe in Darmstadt, when the Automatic Updates Initiative team met with contributors from the Composer Initiative to compare needs and goals.

2019

In 2019, with sponsorship from the European Commission (EC), the Drupal Association contracted additional developers to build the first iteration of the Automatic Updates concept.

On the server-side, the funding from the EC resulted in all packages hosted on Drupal.org being signed with PHP Signify. PHP Signify is a PHP implementation of OpenBSD's Signify. PHP Signify assists in verifying the authenticity of Drupal modules, safeguarding against malicious forgeries. Additionally, Drupal extended OpenBSD's Signify to support chained signatures (CSIG) for better key rotation and maintenance.

On the client-side, the funding resulted in a contributed module for Drupal 7. Due to the European Commission's exclusive use of Drupal 7 at the time, a Drupal 8 module was out of scope. The Drupal 7 module updates tar-based installations of Drupal, as Composer wasn't introduced until Drupal 8.

In my DrupalCon Amsterdam keynote in late 2019, I provided an update on the Automatic Updates initiative with the assistance of Tim Lehnen from the Drupal Association.

2020

Up until 2020, contributed modules used version numbers like 8.x-2.1. This example meant the module was compatible with Drupal 8, and that it was major version 2 with patch level 1. In other words, we supported major and patch level releases, but no minor releases.

We finally updated Drupal.org to enable semantic versioning for contributed modules. This brought them in line with best practices and with Drupal Core, including the ability to have minor releases.

Composer Façade continued to support modules that had not adopted semantic versioning.

Image removed.
A slide from my keynote at DrupalCon Global 2020 where I gave an update on the Automatic Updates initiative. The slide shows the four major architectural building blocks of the Automatic Updates initiative.

Meanwhile, work on Automatic Updates continued. Because we had already embraced Composer, it seemed obvious that we would use Composer under the hood to power Automatic Updates. However, there was one feature we identified as missing from the existing Composer/Packagist ecosystem: package signing.

Composer's main security measure is at the transport layer: communication between the client (Drupal) and the package repository (drupal.org, packagist.org, github.com) is protected by https (TLS). However, we didn't believe that to be sufficient for an automatic update system.

Early in 2020, at a CMS security conference sponsored by Google, David Strauss proposed that Drupal implement The Update Framework (TUF), which would resolve several architectural issues with PHP Signify and also provide a specification to mitigate numerous kinds of supply chain attacks that we had not considered previously.

To start off this project, developers from the Drupal community met with leaders of TYPO3 (Benni Mack, Oliver Hader) and Joomla! (David Jardin, Tobias Zulauf) to ensure this implementation of TUF would be beneficial not only to Drupal, but to the broader PHP ecosystem, especially to other Composer-driven projects.

With guidance from Trishank Karthik Kuppusamy (Datadog) and Joshua Lock (Python TUF), Ted Bowman, Adam Globus-Hoenich, xjm, David Strauss, David Stoline, and others developed PHP-TUF, with sponsorship from Acquia, Pantheon, and DDEV. PHP-TUF handles the client-side part of TUF that will run as part of every Drupal site.

At this time, the Drupal Association also began working on the server-side of the TUF implementation so that Drupal.org would be able to sign packages.

In addition to securing the update process with TUF, we also needed to figure out how to apply updates to a live site with minimal interruption. David Strauss, Mike Baynton, and Lucas Hedding (sponsored by Pantheon, Tag1, and MTech, respectively) had previously prototyped a blue-green deployment approach similar to the one used by CoreOS.

We decided that the required changes to support this would be too disruptive to Drupal, so we pivoted to a new approach proposed by David Strauss: to perform updates in a temporary copy of the site's codebase and then copy the changes to the live codebase as the final step.

While not perfectly atomic in the way that a blue-green deployment would have been, the key advantage to this approach is that it didn't require any changes to Drupal Core's file structure, which meant that it could also be easily adopted by other PHP projects. Travis Carden (Acquia) began implementing this approach as the Composer Stager library.

2021

Image removed.
A design proposal for Automatic Updates. There are updates available for different modules. You can upgrade them immediately using the user interface, or you can let the scheduler run to do it for you.

The second iteration of the Automatic Updates module was released as a beta. Unlike the first iteration sponsored by the European Commission, this version worked for Composer-based projects by leveraging the newly created Composer Stager library.

I had also gone on a virtual listening tour around the same time, and when I asked people why they fell in love with Drupal, the most common response had to do with the empowerment they felt from Drupal's no-code/low-code approach.

With that in mind, I proposed the Project Browser Initiative. The idea was that anyone should be able to install modules, including their third party dependencies, all without having to resort to using Composer on the command line.

This dovetailed nicely with the Automatic Updates initiative. The combination of Automatic Updates and Project Browser would give Drupal the equivalent of an 'app store', making it easy for anyone to discover, install, and update a module and its components.

2022

Image removed.
A design proposal for the Project Browser. Users can filter modules by category, development status, security policy and more. Users can also page through results or sort the results by the number of active installs.

In 2022, we began work on making Automatic Updates' Composer functionality available for Project Browser, so that module installs and updates are handled in the same seamless, robust way. The new Package Manager (a sub-module of Automatic Updates) provides this functionality for both Automatic Updates and Project Browser, and will be the cornerstone of Drupal's install and update functionality.

Ben Mullins (Acquia), Narendra Singh Rathore (Acquia) and Fran Garcia-Linares (Drupal Association) from the Project Browser team collaborated with Ted Bowman (Acquia), Adam Globus-Hoenich (Acquia), Kunal Sachdev (Acquia), Omkar Podey (Acquia), and Yash Rode (Acquia) from the Automatic Updates team to enhance the Package Manager's capabilities so that it could cater to both use cases.

While work was ongoing on the client side of both Automatic Updates and the Project Browser, the Drupal Association remained focused on the server side. The Drupal Association put out an RFP to implement the TUF signing specification in a way that would integrate with Drupal.org's packaging pipeline. Together with Christopher Gervais and his team at Consensus Enterprises, they developed and released the Open Source Rugged server, a server-side TUF implementation that is the companion to the PHP-TUF client.

Drupal Association team member Fran Garcia-Linares also started work on new Drupal.org endpoints that will feed the necessary data for the Project Browser. These endpoints were built on modern Drupal, with JSON:API, and will be deployed to production in the first half of 2023.

2023

That brings us to today. Project Browser and Automatic Updates are still two of the biggest initiatives for Drupal. Chris Wells (Redfin Solutions) and Leslie Glynn (Redfin Solutions) are leading the Project Browser initiative, and Ted Bowman (Acquia) and Tim Lehnen (Drupal Association) are leading the Automatic Updates initiative.

Both are built on top of Drupal's new Package Manager. Package Manager provides these initiatives with the ability to programmatically utilize Composer under the hood. Acquia and the Drupal Association are funding several people to work on these initiatives full-time, while other organizations like Redfin Solutions, Agileana, PreviousNext, Third & Grove, and more have provided extensive part-time contributions.

At the time of writing this, Narayan Newton (Tag1 Consulting), as part of the Drupal.org infrastructure team, is working on deploying the Rugged TUF server on the Drupal.org infrastructure. xjm (Agileana) and catch (Third & Grove), two of Drupal Core's release managers, are also collaborating on both the client and server sides of the initiative to help smooth the path to inclusion into Drupal Core.

We have built key parts of our solution in such a way that they can easily be adopted by any PHP project: from PHP-TUF to Rugged and Composer Stager. In the spirit of Open Source, our implementation was based on other Open Source projects, and now our work can be leveraged by others in turn. We encourage any PHP project that seeks to implement automated updates and a UI-based package manager to do so.

Automatic Updates is currently available as a contributed module that facilitates updates for Drupal Core. The Automatic Updates Extensions module (a sub-module that ships with Automatic Updates) provides automatic updates for contributed modules and themes. The Project Browser is also currently available as a contributed module.

Our goal is to have both Automatic Updates and Project Browser included in Drupal Core, making them out-of-the-box features for all end users. I'm hopeful we can take the final steps to flush out the remaining bugs, finalize the Drupal.org services and APIs, and move these modules to Drupal Core in the second half of 2023.

Conclusion

Getting Automatic Updates and Project Browser into Drupal Core will be the result of 10+ years of hard work.

After all these years, we believe Drupal's Automatic Updates and Project Browser to be both the most user-friendly and most security-conscious tools of their kind among all PHP applications.

We were also able to overcome most of the drawbacks of the original Drupal 7 Update Manager: Composer helps us manage module conflicts, and updates are first applied to a staged copy of the site's codebase to ensure they do not cause any unintended side effects.

In the end, Drupal will offer an 'app-store'-like experience. Drupal contributors can register, promote, update, version, and certify modules through Drupal.org. And Drupal end users can securely install and update modules from within their Drupal site without having to use the command line.

I'm excited about achieving this milestone because it will make Drupal a lot easier to use. In fact, Drupal will be easier to install and update than Drupal 7 ever was. Think about that. Furthermore, Drupal will help showcase how one can democratize composability and advanced dependency management. I'm optimistic that in a few years, we'll realize that adopting Composer for dependency management was the correct decision, even if it was difficult initially.

Hundreds of people have been involved in climbing to reach this summit, and hundreds more outside of the Drupal project have influenced and guided our thinking. I'm grateful to everyone involved in helping to make Drupal more composable and easier to use for people worldwide. Thank you!

Special thanks to Alex Bronstein, Christopher Wells, David Strauss, Derek Wright, Gábor Hojtsy, Lee Rowlands, Nathaniel Catchpole, Neil Drumm, Ted Bowman, Tim Lehnen, Tim Plunkett, Peter Wolanin, Wim Leers, and xjm for their contributions to this blog post. Taking a stroll down memory lane with you was a blast!

With the help of the above reviewers, I made an effort to acknowledge and give credit to those who deserve it. However, there is always a possibility that we missed significant contributors. If you have any corrections or additions, feel free to email me at dries@buytaert.net, and I'll update the blog post accordingly.

Drupal Association blog: Find Community at DrupalCon Pittsburgh 2023

DrupalCon is a community event, where people from around the world can come together and share in the mission to make Drupal the most impactful DXP in the world. Even beyond that, DrupalCon is for many people the one time a year to connect in person with the people they work with every day. We are so excited to share all the places where the community can connect in June! 

Community Summit 

Join us on Thursday, 8 June at DrupalCon Pittsburgh for a full-day unconference dedicated to exploring the issues that matter most to the Drupal community. What’s an unconference, you ask? It is a loosely structured conference that emphasizes sharing information instead of following a conventionally structured schedule. Together we will select topics that matter most to attendees and have collaborative discussions throughout the day. 

RSVP

There is a lot more information about unconferences online, including great tips on how to prepare to attend one. Please feel free to contact the organizers if you have questions.

Who should attend?

You! Are you part of a Drupal community or want to start one in your city? This summit is meant to be a meeting place for all Drupalistas interested in the community. Whether you need help maintaining a long-standing camp or User Group, you’re new to the community and want to know where you can get involved, or you want to share a success story from your community, everyone is welcome.

What gets discussed?

In the morning: 

Presentations on the latest work being done from some of the Community Initiative Leads. Check out the detailed schedule page for session descriptions! 

In the afternoon: 

You decide! Topics at past events have included:

  • Contributing to the community
  • Organizing events
  • Improving diversity
  • Growing local communities
  • How to prevent burnout
  • How do we foster and grow mentorship programs

But it all gets decided on the day, by whoever is in the room.

The community summit isn’t all about talk, it’s about action as well. Networking is a powerful force for collaboration.

Check out the detailed schedule

Community summit success stories

Several initiatives have started as a conversation at past community summits:

  • The Event Organizer Working Group
  • Speaker Diversity workshops
  • Internship programs

Please RSVP and let us know what you’d like to discuss!

Community Initiatives 

Come by the community booth in the expo hall to learn more about Drupal Community Initiatives and how you can get involved! 

Social Events 

Social events are often where the magic happens at DrupalCon. Whether it’s a small group hike, coffee hang-outs, or organized parties by our amazing partners, social events help foster lifelong connections. Check out the social events page to stay up to date on what’s happening around the city during DrupalCon, and even submit one yourself! 

Social events at DrupalCon

Decompress 

The festivities at DrupalCon can be overwhelming at times, and it can be hard to find a space to be in community quietly. For a respite from the hustle and bustle, visit our Quiet Space, Non-Denominational Prayer Space, or rooftop terrace. 

Quiet Space - Room 313

This room is reserved for those who need space free from interaction with other participants in an environment where they feel free and safe to do so. The quiet rooms are free of conversation and interaction of any kind. They can be used for meditation, journaling, reading, or just silence away from the buzz of the Con. Soft seating, stimming tools, and relaxing art supplies are provided. 

Image removed.

Non-Denominational Prayer Space - Room 314

This room is reserved for those who wish to practice their faith in a quiet and respectful environment. Spiritual texts and prayer rugs are provided. We ask that participants use hushed voices when necessary, but practice silence as much as possible to respect other attendees who may be using the space.

Image removed.

Rooftop Terrace 

Relax in green space without having to leave the DLCC. The rooftop terrace overlooks the Allegheny River, and you’ll have a view of many of Pittsburgh’s bridges. Enjoy the sounds of the water below, or take a stroll along the Rooftop Boulevard, where gentle tones play from the speakers above. If you take a look at the swooping art fixture, you can also find inspirational quotes scrolling digitally.

Image removed.

No matter how you like to be in community, there’s a place for you at DrupalCon. We can’t wait to share space with you! 

Matt Glaman: Creating fields programmatically and not through field configuration

Drupal is great for its content (data) modeling capabilities with its entity and field system. Not only is this system robust, but it is also completely manageable from a user interface! When fields are created through the user interface, they are managed through configuration. Drupal automates all schema changes required in the database based on this configuration when it is first created or when it might be removed. I have always had one issue with managing fields through configuration – your business logic relies on trusting that no one modifies this configuration. I prefer to define my fields programmatically, ensuring they exist and that fields tied to business logic are always present. Of course, this means I must also manage their installation and deletion for schema changes. But I prefer that level of control.
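
As a rough illustration of what a field defined in code (rather than configuration) can look like, here is a minimal sketch using a base field definition; the module and field names are hypothetical, and this is just one possible approach rather than necessarily the one described in the full post.

use Drupal\Core\Entity\EntityTypeInterface;
use Drupal\Core\Field\BaseFieldDefinition;

/**
 * Implements hook_entity_base_field_info().
 *
 * Defines a field in code so it always exists, independent of any exported
 * field configuration. (Hypothetical module and field names.)
 */
function mymodule_entity_base_field_info(EntityTypeInterface $entity_type) {
  $fields = [];
  if ($entity_type->id() === 'node') {
    $fields['external_reference'] = BaseFieldDefinition::create('string')
      ->setLabel(t('External reference'))
      ->setDescription(t('An identifier our business logic depends on.'))
      ->setDisplayOptions('form', ['type' => 'string_textfield', 'weight' => 10])
      ->setDisplayConfigurable('form', TRUE)
      ->setDisplayConfigurable('view', TRUE);
  }
  return $fields;
}

Installing the storage for such a field on an existing site is then a schema change applied through the entity definition update manager (for example, an update hook calling installFieldStorageDefinition()), which is the trade-off mentioned above.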

Specbee: Customizing Content Display in Drupal: A Guide to Display Modes


The primary function of a CMS is to create and manage content, as well as to allow users to search, customize, and publish content. The way your content is presented can decide how easy or difficult it is to use your website. In Drupal, display modes are different ways of presenting your content on your website. For example, you can show all the information about an article on a website, like a title, author, date, and full text, or you can show just a summary of it. Display modes let you choose what information you want to show and how it's arranged.

Display modes are built into Drupal core and they provide an easy way for developers to control how their content appears on the page. Content types in Drupal provide different types of display modes, each with its own features. For example, nodes (content pieces) can have multiple displays, such as teaser, full, list, and even a custom view. While some content types are limited to one display type (e.g. single page), other content types can have multiple display modes.

This article will help you get a deeper understanding of display modes, their uses, applications and extensions. So let’s get started!

Image removed.

What are Display Modes

Display modes are one of the core features of Drupal that help us manage how content entities, and the forms used to add and edit them, are displayed in different situations. Content entities include nodes, taxonomy terms, and paragraphs.

Display modes come in two kinds: view modes and form modes. Like other configuration entities, such as content types, they can be exported and imported as configuration files (.yml).
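
For example, here is a hedged sketch of what the exported configuration for the core "Teaser" view mode on content might look like (file core.entity_view_mode.node.teaser.yml, simplified):

langcode: en
status: true
dependencies:
  module:
    - node
id: node.teaser
label: Teaser
targetEntityType: node
cache: true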

Why use them?

Display modes are just one of the many features that make Drupal great: they let users customize their websites by controlling how different types of content are displayed. They also help to: 

  1. Render content on the page only when required. The fields we don't need can be skipped.
  2. Increase site performance.
  3. Reduce the necessity to create multiple or duplicate content.

Working with Display Modes

Display modes can be accessed from the Structure menu (Structure > Display modes).

Image removed.

 

As discussed previously, there are two types of display modes:

  1. View Mode
  2. Form Mode

Using the View Mode

View modes manage how a content item is displayed.
Different view modes apply to different entity types; for example, a view mode created for taxonomy terms can only be used with taxonomy terms. 

You can create new view modes or modify existing view modes.

To create a new view mode, click “Add view mode” and choose the type for which you want to create it. For example, select “Content”. This will create a view mode that can be used for content items (nodes).

Image removed.

To configure a view mode, go to the settings of the content type where you want to use it. Next, go to “Manage display”. At the bottom, under “Custom display settings”, you can select any of the available view modes that you want to use.

Image removed.

Here we have selected three view modes. Next, save this configuration. You will then see that these view modes are visible in the “Manage display” tab.

Image removed.

For each mode, you can configure how the content is displayed on the front end. For example, you might hide a field or render it with a different formatter.
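
View modes can also be used from code. Here is a minimal sketch (the node ID is arbitrary) of rendering a node with the “teaser” view mode:

use Drupal\node\Entity\Node;

// Load a node and build its render array using the 'teaser' view mode.
$node = Node::load(1);
$view_builder = \Drupal::entityTypeManager()->getViewBuilder('node');
$build = $view_builder->view($node, 'teaser');
// Rendering $build applies the field formatters configured for 'teaser'
// under "Manage display".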

A few common use cases:

  1. Entity reference field
  2. View items
  3. Paragraphs

Form Mode

Form modes can be used to simplify forms for certain use cases, such as creating a simpler form for user registration, or to add additional fields and functionality to a form for more complex use cases. Most commonly, you want to display forms according to user roles.

As with view modes, you can create and configure your own form modes. If you want to use a form mode for a node add/edit form, you can configure it under “Manage form display”.

Form modes will not work directly from the UI for all forms. If you need to use a custom form mode for node forms, you have to wire it up programmatically.
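
Here is a minimal sketch of one way to do this, assuming a form mode named “simplified” has been created for content (the module name and node ID are hypothetical, and this is only one possible approach): register a matching form operation for nodes, then build the form with that operation.

use Drupal\node\Entity\Node;

/**
 * Implements hook_entity_type_build().
 */
function mymodule_entity_type_build(array &$entity_types) {
  // Register a 'simplified' form operation for nodes, reusing the default
  // node form class. The operation name matches the form mode's machine
  // name, so the 'simplified' form display is picked up when the form is built.
  $entity_types['node']->setFormClass('simplified', 'Drupal\node\NodeForm');
}

// Elsewhere (e.g. in a controller), build a node form using that form mode.
$node = Node::load(1);
$form = \Drupal::service('entity.form_builder')->getForm($node, 'simplified');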

If you want easier UI options for using form modes, you can use the Form Mode Manager and Form Mode Control contributed modules.

Form Mode Manager

After installing the module, you will see the form mode listed on the module configuration page.

Image removed.

Here you can manage which form mode to use for a user by providing permissions.

Image removed.

 

Form Mode Control

The module also simplifies the process of using form modes.

After enabling the module, you can see form modes on the configuration page. You can choose which form mode to use.

Image removed.

 

If you want to use the form mode for other user roles, you have to provide relevant permissions.

Image removed.

 

You can set up the user roles on the configuration page.

Image removed.

Final Thoughts

Features like display modes make Drupal a great CMS to work with as they provide a high degree of flexibility and customization for how content is displayed on your website. Display modes allow you to create different layouts and designs for your content, giving you the ability to showcase your content in unique and engaging ways. We hope this brief tutorial will help you understand and work with Display modes more effectively. If you like what you read here, consider subscribing to our weekly newsletter so you don’t miss out on any of our latest articles.

Author: Pratik

Meet Pratik, a Drupal developer by profession. He enjoys reading and is always eager to learn new languages. Someday, he would love to visit Italy, his dream travel destination. He enjoys diverse cuisines and has a passion for food.

Talking Drupal: Talking Drupal #393 - Drupal & Javascript

Today we are talking about Drupal & JavaScript with Andy Blum.

For show notes visit: www.talkingDrupal.com/393

Topics
  • Talk at FLDC
  • Important Drupal JS features
  • Drupal behaviors
  • Why use JS
  • Users with no JS
  • jQuery
  • Front end framework
  • Bigpipe
  • JS components single folder components
  • Future of JS in Drupal
Resources

Guests

Andy Blum - @andy_blum

Hosts

Nic Laflin - www.nLighteneddevelopment.com @nicxvan
John Picozzi - www.epam.com @johnpicozzi
Kat Shaw - drupal.org/u/katannshaw @katannshaw

MOTW Correspondent

Martin Anderson-Clutz - @mandclu

Real-time SEO for Drupal
Provides content authors immediate feedback about how optimized their content is against specific keywords.

The Drop Times: Dear Readers, It Is All in Your Hands

It would be a slight change of scenery for you readers—an inexperienced writer in the works. I have always preferred people who knew what they were doing. Of course, from my stance, that preference would be because I want to learn. But when TDT’s editorial meeting puts confidence in a newcomer, and she is tasked to write a personal note to our avid readers, that sounds like an opportunity. And here I am grabbing that moment. 

Writing as a profession was revealed to me quite early on, but the art of writing is what I struggle with every day. I don’t want to speak for everyone, but, to be honest, almost every writer could feel this way.

Now, I do have a simple plan. To observe is to learn. And my first target is to learn from my mentors. Taking my own advice and looking into what our Editor-in-Chief mentioned in a previous newsletter:

It is the baby steps that matter. And once you are thorough, it comes naturally to you. To reach that level need patience and practice. ~ VOL. 1 ISSUE. 7

The words, tone, structure of the sentences, references, and how the stories are told—Storytelling is a wonder. Has the media mastered the art of storytelling? A liar could have that skill as well. Apparently, it is all in the details, not too much or too little.

I think it is all perception and the power of discernment. It is in your hands, dear readers, all in yours. So, if you believe my writing is good and faithful, it could be. But if you don’t, it is all in your hands again. Until you decide, I’ll take you through what we have covered on TDT this past week.

I want to point out that writing a news story is a bit different. As you know, it is all about accuracy and facts. There aren’t many areas for storytelling, or are there? 

Now straight to the week’s stories. 

The DrupalCamp Poland proposals deadline is approaching soon, and the Drupal User Group Hamburg & Schleswig Holstein hosted an in-person meetup on March 30.

As the EoL of Drupal 7 nears, Chromatic released a Podcast that discusses the implications of Drupal 7 End of Life. Another exciting release was Droptica’s E-Book on SEO Strategies for Drupal Websites.

NERD Summit ended last month; here are the summit’s statistics: 2023 vs. before, excluding the covid years. DrupalSouth calls for a broad panel of judges for the splash awards. Fox Valley Drupal will host the midCamp preview and ramp-up. Acquia stands out as a leader in the 2023 Gartner Magic Quadrant for DXPs and SystemSeed Shortlisted for the 2023 Global Business Tech Awards.

There will be a Drupal4Gov in-person half-day seminar on April 18, so keep an eye out for that, and also make sure to read an article shared on opensource.com, titled “How to encourage positive online communication in your open source community.”

There is more to ramble on about, but these were a few favorites picked in all truth. That’s it for this week.

Sincerely,
Alethia Rose Braganza
Sub-Editor.

ComputerMinds.co.uk: How to: Implement an automated Commerce Order state transition

Image removed.

A common requirement for any website that sells products is to have a mechanism in place that ensures orders placed on the website are 'Exportable' - being made available as a file that can be sent across to a different system, to handle the processing of the order.

The Drupal Commerce 2.x module (for Drupal 9, 10) has the concept of order 'Workflows', along with defined 'States' and 'Transitions'. A workflow is a set of states and transitions that the order will go through during its lifecycle. Transitions are applied to an order and this will progress the order from one state to another. By default, a few workflows are provided but for sites that we build, we usually write our own order workflows to handle the specific requirements of the site in question.

Image removed.

A comparison of one of the default Commerce order workflows (on the left) and our own custom workflow (on the right). For more information about creating a custom order workflow, see the excellent documentation on drupalcommerce.org.
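
As a rough illustration (machine names are hypothetical, chosen here to match the states used later in this article), a custom order workflow is declared in a MODULE_NAME.workflows.yml file along these lines:

example_order_exporter_workflow:
  id: example_order_exporter_workflow
  group: commerce_order
  label: 'Example order workflow'
  states:
    draft:
      label: Draft
    payment:
      label: Placed
    completed:
      label: Exported
  transitions:
    place:
      label: 'Place order'
      from: [draft]
      to: payment
    completed:
      label: 'Export order'
      from: [payment]
      to: completed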

One common requirement that doesn’t quite work properly ‘out of the box’ is the ability for an order to be automatically transitioned to a separate 'exported' or 'completed' state, immediately after the order has been paid for successfully. We can achieve the desired functionality with a combination of an event subscriber and a useful entity update hook. Here’s what we did:

Step 1 - Implement the event subscriber.

We’ve already covered event subscribers before in a number of our articles so I won’t go into the specifics here, but the first thing we’ll want to do is create a new one that will be able to react to our Orders’ state transition.

N.B. In this example, we are assuming our custom code will live in a module that already exists, named ‘example_order_exporter’.

First off, we’ll add a new file inside of our custom module to handle our event subscriber. In the example code in this article, we’ll be naming it OrderExporterSubscriber.php and the full path it should live under is example_order_exporter/src/EventSubscriber/OrderExporterSubscriber.php

namespace Drupal\example_order_exporter\EventSubscriber;

use Drupal\example_order_exporter\OrderExporter;
use Drupal\state_machine\Event\WorkflowTransitionEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class OrderExporterSubscriber implements EventSubscriberInterface {

  /**
   * @var \Drupal\example_order_exporter\OrderExporter
   */
  protected $orderExporter;

  /**
   * {@inheritDoc}
   */
  public function __construct(OrderExporter $order_exporter) {
    $this->orderExporter = $order_exporter;
  }

  /**
   * The 'place' transition takes place after the order has been paid for in
   * full, so we'll subscribe to the post_transition event at this point.
   */
  public static function getSubscribedEvents() {
    return [
      'commerce_order.place.post_transition' => ['onOrderPlace', -100],
    ];
  }

  /**
   * Act on an order after it has been paid for.
   *
   * We simply set a success / failure flag for the export here, and then
   * handle the state change in a hook_commerce_order_update() implementation
   * as the previous state change to payment will have finished by that point.
   *
   * @see example_order_exporter_commerce_order_update()
   */
  public function onOrderPlace(WorkflowTransitionEvent $event) {
    /** @var \Drupal\commerce_order\Entity\OrderInterface $order */
    $order = $event->getEntity();
    // Call our service to export the order.
    $exported = $this->orderExporter->export($order);
    // Update the success flag on the order with the result of the export.
    $order->setData('export_success', $exported);
  }

}

Nothing scary here, we are just subscribing to the ‘commerce_order.place.post_transition’ event and defining a function to run in reaction to that event (onOrderPlace). Each commerce order transition defined in your order workflow automatically has a ‘pre_transition’ and ‘post_transition’ event made available, and we are subscribing to the post_transition event for the place transition. In this example, the place transition is the specific transition that happens when the order has been successfully paid for.

Inside our onOrderPlace() function, we get the order entity that is available in this event, calling the custom service that we have injected into the class, to do the exporting of the order. We then set a flag on the order using the setData method (which lets us set arbitrary data against the order) with the result of the order export from our custom service.

Of course, nothing is stopping you from having code inside of this onOrderPlace() function that does the actual exporting but we usually like to separate the logic into its own service class. This separation approach means that we can then easily call the order exporting service in other places that we might want it, such as a Drush command.

We’ll also need to let Drupal know about our new event subscriber so will need to add an entry into our module’s services.yml file, e.g.

services:
  example_order_exporter.order_export_subscriber:
    class: Drupal\example_order_exporter\EventSubscriber\OrderExporterSubscriber
    arguments: ['@example_order_exporter.order_exporter']
    tags:
      - { name: event_subscriber }

In this example, we have also specified our custom order exporter service as an argument to our order subscriber (that would also have a definition in this .yml file). If you have decided to include all of the order export logic directly inside the onOrderPlace() function (or aren’t using a custom service to do such a thing) then you can omit this from the arguments to the subscriber and from the __construct() method of the subscriber in the OrderExporterSubscriber.php file.

Step 2 - The hook_commerce_order_update() implementation

The second part of the work here is to have a hook_commerce_order_update implementation inside of our custom module that will handle checking the order data that we set previously in our subscriber, and then apply the appropriate transition if successful.

It’s important that we do this here in the hook implementation! If we try to apply the transition directly inside of the onOrderPlace function from our event subscriber, then this would start the transitioning process for our new transition before the other transition has fully finished. This means that the original transition wouldn’t have necessarily been completed and any functionality driven from the transition we just applied would run, and then the original state transition that we subscribed to would try and finish off afterwards.

Aside from being logically incorrect, this leads to weird inconsistencies in the order log where you end up with something like this, e.g.

"Order state was moved from Draft to Exported" "Order state was moved from Exported to Exported"

Instead of what it should be:

"Order state was moved from Draft to Placed" "Order state was moved from Placed to Exported"

Here’s the sample code that would need to go into the .module file of the example_order_exporter module (example_order_exporter.module).

use Drupal\commerce_order\Entity\OrderInterface;

/**
 * Implements hook_commerce_order_update().
 */
function example_order_exporter_commerce_order_update(OrderInterface $order) {
  // Check that the current state is 'payment' (i.e. we have finished
  // transitioning to the payment state) and that the export_success flag is
  // set to TRUE on the order, which indicates our event subscriber that
  // exports the order has successfully run.
  // This hook is called *after* the transition has finished, so we can safely
  // apply the new transition to 'completed' here.
  $order_state = $order->getState();
  $current_state = $order_state->getId();
  if ($current_state === 'payment') {
    if ($order->getData('export_success') === TRUE) {
      $order_state_transitions = $order_state->getTransitions();
      $transition_to_apply = 'completed';
      // Update the state of the order by applying the transition.
      // If the order state transitions are empty then this order is
      // 'completed' and shouldn't have its state changed.
      if (isset($order_state_transitions[$transition_to_apply])) {
        $order_state->applyTransition($order_state_transitions[$transition_to_apply]);
        $order->save();
      }
    }
  }
}

We check the current state of the order is what we expect it to be - payment - and if the export_success flag we set previously is true. Only if these two conditions are true do we then try to apply the transition with the useful applyTransition method on the order state.

The $order_state has a useful method called getTransitions which returns an array of all the valid transitions for the order in its current state. This allows us to do a quick check to be sure that the ‘completed’ state (the final state of our order workflow) is present in the list of allowed transitions, before trying to apply it. If not present, this would mean this order has already been exported, and we don't try to export the order again.

We are being slightly sneaky here by calling a save on the order at this point in time, as this commerce_order_update hook is triggered during the postSave hook of the original order save. To be safe, we’ll use the (always handy) hook_module_implements_alter to ensure that our hook implementation here always runs last. This will ensure we aren’t accidentally stopping any other module’s hook_commerce_order_update implementation from running before ours.

/**
 * Implements hook_module_implements_alter().
 */
function example_order_exporter_module_implements_alter(&$implementations, $hook) {
  if ($hook == 'commerce_order_update') {
    $group = $implementations['example_order_exporter'];
    unset($implementations['example_order_exporter']);
    $implementations['example_order_exporter'] = $group;
  }
}

So there you have it, a nice clean way of having an automated transition run on the order without getting into any weird state transition issues caused by calling applyTransition whilst reacting to a previous transition. In this example, it’s used purely when we are exporting an order after payment has been made, but nothing is stopping you from reacting to other order transitions depending on your workflow’s needs!

LN Webworks: The 8-Step Anatomy of a Successful UX Design Process

UX design is a cornerstone of building user-centric websites and applications. The demand for UX designers around the globe has increased significantly, and it keeps climbing every year. Why is that happening? Because building human-centric applications and websites is no longer optional. When UX design services are combined with Drupal, a robust CMS, the result is a powerful combination that makes people choose your brand for its superior functionality and user experience. But what does the UX design process actually look like? In this article, you will learn a simple (yet powerful) 8-step UX design process to implement in your business.

What is a UX Design Process?

A UX design process is a systematic approach to creating user-centered designs and building websites and applications that align with users’ needs and preferences. Here’s the catch: a UX design process that worked for one business may not work for another. For example, the UX process for building a travel website is different from that of building a matrimonial app.