
Dries Buytaert: The little HTTP Header Analyzer that could

HTTP headers play a crucial part in making your website fast and secure. For that reason, I often inspect HTTP headers to troubleshoot caching problems or review security settings.

The complexity of the HTTP standard and the challenge of remembering all the best practices led me to develop an HTTP Header Analyzer four years ago.

It is pretty simple: enter a URL, and the tool will analyze the headers sent by your webserver, CMS or web application. It then explains these headers, offers a score, and suggests possible improvements.
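
If you want to peek at the raw headers yourself before (or after) running them through the analyzer, a plain curl request is enough. This is just an illustrative sketch, with example.com standing in for your own site:

[code bash]$ curl -sI https://example.com[/code]

Headers such as cache-control and strict-transport-security are typically the first ones to check when troubleshooting caching or reviewing security settings.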

For a demonstration, click https://dri.es/headers?url=https://www.reddit.com. As the URL suggests, it will analyze the HTTP headers of Reddit.com.

I began this as a weekend project in the early days of COVID, seeing it as just another addition to my toolkit. At the time, I simply added it to my projects page but never announced or mentioned it on my blog.

So why write about it now? Because I happened to check my log files and, lo and behold, the little scanner that could had clocked more than 5 million scans, averaging over 3,500 scans a day.

So four years and five million scans later, I'm finally announcing it to the world!

If you haven't tried my HTTP header analyzer, check it out. It's free, easy to use, requires no sign-up, and is built to help improve your website's performance and security.

The scanner works with all websites, but naturally, I added some special checks for Drupal sites.

I don't have any major plans for the scanner. At some point, I'd like to move it to its own domain, as it feels a bit out of place as part of my personal website. But for now, that isn't a priority.

For the time being, I'm open to any feedback or suggestions and will commit to making any necessary corrections or improvements.

It's rewarding to know that this tool has made thousands of websites faster and safer! It's also a great reminder to share your work, even in the simplest way – you never know the impact it could have.

Dries Buytaert: The Watchmaker's Approach to Web Development


Since 1999, I've been consistently working on this website, making it one of my longest-standing projects. Even after all these years, the satisfaction of working on my website remains strong. Remarkable, indeed.

During rare moments of calm — be it a slow holiday afternoon, a long flight home, or the early morning stillness — I'm often drawn to tinkering with my website.

When working on my website, I often make small tweaks and improvements. Much like a watchmaker meticulously fine-tuning the gears of an antique clock, I pay close attention to details.

This holiday, I improved the lazy loading of images in my blog posts, leading to a perfect Lighthouse score. A perfect score isn't necessary, but it shows the effort and care I put into my website.

Screenshot of Lighthouse scores via https://pagespeed.web.dev/.

I also validated my RSS feeds, uncovering a few opportunities for improvement. Like a good Belgian schoolboy, I promptly implemented these improvements, added new PHPUnit tests, and integrated these into my CI/CD pipeline. Some might consider this overkill for a personal site, but for me, it's about mastering the craft, adhering to high standards, and building something that is durable.

Last year, I added 135 new photos to my website, a way for me to document my adventures and family moments. As the year drew to a close, I made sure all new photos have descriptive alt-texts, ensuring they're accessible to all. Writing alt-texts can be tedious, yet it's these small but important details that give me satisfaction.

Just like the watchmaker working on an antique watch, it's not just about keeping time better; it's about cherishing the process and craft. There is something uniquely calming about slowly iterating on the details of a website. I call it the Watchmaker's Approach to Web Development, where the process holds as much value as the result.

I'm thankful for my website as it provides me a space where I can create, share, and unwind. Why share all this? Perhaps to encourage more people to dive into the world of website creation and maintenance.

Dries Buytaert: Effortless inspecting of Drupal database queries

Drupal's database abstraction layer is great for at least two reasons. First, it ensures compatibility with various database systems like MySQL, MariaDB, PostgreSQL and SQLite. Second, it improves security by preventing SQL injection attacks.

My main concern with using any database abstraction layer is the seemingly reduced control over the final SQL query. I like to see and understand the final query.

To inspect and debug database queries, I like to use MariaDB's General Query Log. When enabled, this feature captures every query directly in a text file.

I like this approach for a few reasons:

  • It directly captures the query in MariaDB instead of Drupal. This provides the most accurate view of the final SQL queries.
  • It enables real-time monitoring of all queries in a separate terminal window, removing the need to insert debug statements into code.
  • It records both successful and unsuccessful queries, making it a handy tool for debugging purposes.

If you're interested in trying it out, here is how I set up and use MariaDB's General Query Log.

First, log in as the MySQL super user. Since I'm using DDEV for my development, the command to do that looks like this:

[code bash]$ ddev mysql -u root -proot[/code]

After logging in as the MySQL super user, you need to set a few global variables:

[code shell]SET global general_log = 1;
SET global log_output = 'file';
SET global general_log_file = '/home/dries/queries.txt';[/code]

This will start logging all queries to a file named 'queries.txt' in my home directory.

DDEV runs its services in containers: isolated environments designed for running specific software such as databases. To view the queries, you must log into the database container:

[code bash]$ ddev ssh -s db[/code]

Now, you can easily monitor the queries in real-time from a separate terminal window:

[code bash]$ tail -f /home/dries/queries.txt[/code]
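
One extra step worth taking, not covered above but standard MariaDB behaviour: switch the general log off again when you're done, since it grows quickly and adds overhead. From the host, something like this should work with the same DDEV root credentials:

[code bash]$ ddev mysql -u root -proot -e "SET GLOBAL general_log = 0;"[/code]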

This is a simple yet effective way to monitor all database queries that Drupal generates.

Dries Buytaert: The new old: Jamstack and MACH's journey towards traditional CMS concepts

Image removed.

In recent years, new architectures like MACH and Jamstack have emerged in the world of Content Management Systems (CMS) and Digital Experience Platforms (DXP).

In some ways, they have challenged traditional models. As a result, people sometimes ask for my take on MACH and Jamstack. I've mostly refrained from sharing my perspective to avoid controversy.

However, recognizing the value of diverse viewpoints, I've decided to share some of my thoughts. I hope it contributes positively to the ongoing evolution of these technologies.

The Jamstack is embracing its inner CMS

Jamstack, born in 2015, began as an approach for building static sites. The term Jamstack stands for JavaScript, APIs and Markup. Jamstack is an architectural approach that decouples the web experience layer from content and business logic. The web experience layer is often pre-rendered as static pages (markup), served via a Content Delivery Network (CDN).

By 2023, the Jamstack community had undergone a considerable transformation. It evolved significantly from its roots, increasingly integrating traditional CMS features, such as "content previews". This shift is driven by the need to handle more complex websites and to make the Jamstack more user-friendly for marketers, moving beyond its developer-centric origins.

Netlify, a leader in the Jamstack community, has acknowledged this shift. As part of that, they are publicly moving away from the term "Jamstack" towards being a "composable web platform".

Many traditional CMSes, including Drupal, have long embraced the concept of "composability". For example, Drupal has been recognized as a composable CMS for two decades and offers advanced composable capabilities.

This expansion reflects a natural progression in technology, where software tends to grow more complex and multifaceted over time, often referred to as the law of increasing complexity. What starts simple becomes more complex over time.

I believe the Jamstack will continue adding traditional CMS features such as menu management, translation workflows, and marketing technology integrations until they more closely resemble traditional CMSes.

Wake-up call: CMSes are hybrid now

The Jamstack is starting to look more like traditional CMSes, not only in features, but also in architecture. Traditional CMSes can work in two main ways: "coupled" (or "integrated"), where the presentation layer and business logic live in a single application and are separated using the Model-View-Controller (MVC) pattern, or "decoupled", where the front end and back end are split and communicate through web service APIs, as the Jamstack does.

Modern CMSes integrate well with various JavaScript frameworks, have GraphQL backends, can render static pages, and much more.

This combination of both methods is known as "hybrid". Most traditional CMSes have adopted this hybrid approach. For example, Drupal has championed a headless approach for over a decade, predating both the terms "Jamstack" and "Headless".

Asserting that traditional CMSes are "monolithic" and "outdated" is a narrow-minded view held by people who have overlooked their evolution. In reality, today's choice is between a purely Headless or a Hybrid CMS.

Redefining "Jamstack" beyond overstated claims

As the Jamstack becomes more versatile, the original "Jamstack" name and definition feel restrictive. The essence of Jamstack, once centered on creating faster, secure websites with a straightforward deployment process, is changing.

It's time to move beyond some of the original value proposition, not just because of its evolution, but also because many of the Jamstack's advantages have been overstated for years.

In short, Jamstack, initially known for its static site generation and simplicity, is growing into something more dynamic and complex. This is a positive evolution driven by the market's need. This evolution narrows the gap between Jamstack and traditional CMSes, though they still differ somewhat – Jamstack offers a pure headless approach, while traditional CMSes offer both headless and integrated options.

Last but not least, this shift is making Jamstack more similar to MACH, leading to my opinion on the latter.

The key difference between MACH and traditional CMSes

Many readers, especially people in the Drupal community, might be unaware of MACH. MACH is an acronym for Microservices, API-first, Cloud-native SaaS, Headless. It specifies an architectural approach where each business function is a distinct cloud service, often created and maintained by separate vendors and integrated by the end user.

Imagine creating an online store with MACH certified solutions: you'd use different cloud services for each aspect – one for handling payments, another for your product catalog, and so on. A key architectural concept of MACH is that each of these services can be developed, deployed, and managed independently.

At first glance, that doesn't look too different from Drupal or WordPress. Platforms like WordPress and Drupal are already integration-rich, with Drupal surpassing 50,000 different modules in 2023. In fact, some of these modules integrate Drupal with MACH services. For example, when building an online store in Drupal or WordPress, you'd install modules that connect with a payment service, SaaS-based product catalog, and so on.

At a glance, this modular approach seems similar to MACH. Yet, there is a distinct contrast: Drupal and WordPress extend their capabilities by adding modules to a "core platform", while MACH is a collection of independent services, operating without relying on an underlying core platform.

MACH could benefit from "monolithic" features

Many in the MACH community applaud the lack of a core platform and often criticize platforms that do have one as "monolithic". Calling a CMS like Drupal "monolithic" is, in my opinion, a misleading and simplistic marketing strategy. From a technical perspective, Drupal's core platform is exceptionally modular, allowing all components to be swapped.

Rather than (mis)labeling traditional CMSes as "monolithic", a more accurate way to explain the difference between MACH and traditional CMSes is to focus on the presence or absence of a "core platform".

More importantly, what is often overlooked is the vital role a core platform plays in maintaining a consistent user experience, centralized account management, handling integration compatibility and upgrades, streamlining translation and content workflows across integrations, and more. The core platform essentially offers "shared services" aimed to help improve the end user and developer experience. Without a core platform, there can be increased development and maintenance costs, a fragmented user experience, and significant challenges in composability.

This aligns with a phenomenon that I call "MACH fatigue". Increasingly, organizations come to terms with the expenses of developing and maintaining a MACH-based system. Moreover, end-users often face a fragmented user experience. Instead of a seamlessly integrated interface, they often have to navigate multiple disconnected systems to complete their tasks.

To me, it seems that an ideal solution might be described as "loosely-coupled architectures with a highly integrated user experience". Such architectures allow the flexibility to mix and match the best systems (like your preferred CMS, CRM, and commerce platform). Meanwhile, a highly integrated user experience ensures that these combinations are seamless, not just for the end users but also for content creators and experience builders.

At present, traditional platforms like Drupal are closer to this ideal compared to MACH solutions. I anticipate that in the coming years, the MACH community will focus on improving the cost-effectiveness of MACH platforms. This will involve creating tools for managing dependencies and ensuring version compatibility, similar to the practices embraced by Drupal with Composer.

Efforts are already underway to streamline the MACH user experience with "Digital Experience Composition (DXC) tools". DXC acts as a layer over a MACH architecture, offering a user interface that breaks down various MACH elements into modular "Lego blocks". This allows both developers and marketers to assemble these blocks into a digital experience. Users familiar with traditional CMSes might find this a familiar concept, as many CMS platforms include DXC elements as integral features within their core platform.

Traditional CMSes like Drupal can also take cues from Jamstack and MACH, particularly in their approaches to web services. For instance, while Drupal effectively separates business logic from the presentation layer using the MVC pattern, it primarily relies on internal APIs. A shift towards more public web service APIs could enhance Drupal's flexibility and innovation potential.

Conclusions

In recent years, we've witnessed a variety of technical approaches in the CMS/DXP landscape, with MACH, Jamstack, decoupled, and headless architectures each carving out their paths.

Initially, these paths appeared to diverge. However, we're now seeing a trend of convergence, where different systems are learning from each other and integrating their unique strengths.

The Jamstack is evolving beyond its original focus on static site generation into a more dynamic and complex approach, narrowing the gap with traditional CMSes.

Similarly, to bridge the divide with traditional CMSes, MACH may need to broaden its scope to encompass shared services commonly found in the core platform of traditional CMSes. That would help with developer cost, composability and user-friendliness.

In the end, the success of any platform is judged by how effectively it delivers a good user experience and cost efficiency, regardless of its architecture. The focus needs to move away from architectural considerations to how these technologies can create more intuitive, powerful platforms for end users.

PreviousNext: How can free open source CMSes remain competitive with enterprise clients?

With Drupal now heavily used in the enterprise market by very large organisations, much of its direct competition is from well-funded proprietary products. From the perspective of my role on the Drupal Association board, I gave a talk at FOSDEM in February 2024 on the strategies and initiatives the Drupal community is starting to put in place to remain competitive in the enterprise market and how these approaches can be shared by other open source projects. 

by Owen Lansbury / 12 March 2024

This video recording was first published on the FOSDEM website.

Drupal has historically had no centralised product management or marketing, let alone ANY coordinated budget! For comparison, Adobe spends around USD$2.7bn annually on product development, sales and marketing for its Experience Cloud product suite. 

In the talk, I discuss Drupal's recent recognition as a Digital Public Good and the way that the Drupal community is highly motivated by providing world-class software for free to anyone who wants to use it, promoting values of freedom, inclusion, participation and empowerment. The Drupal Association recently released a manifesto that defines the Drupal project's commitment to the Open Web, but in order to fulfil this mission, Drupal needs to be successful as a product in the open market.

Since Drupal 8 was released in 2015, it has been specifically targeted at building "ambitious digital experiences." While this has resulted in an overall drop in Drupal installs as smaller sites move to SaaS platforms, the Drupal economy is robust, with an estimated USD$3 billion spent on Drupal-related projects each year.

Unlike other open source projects, Drupal doesn’t have a single company doing the majority of the code contribution. The Drupal Association has run on a budget of around $3.5m or 1/1000th of the revenue being spent on Drupal projects each year. 

This was brought into focus for the Drupal Association during COVID, when the primary source of income - running DrupalCon events - required an abrupt rethink. We had to refocus on how Drupal would be both successful and sustainable in the future. This has led to us recently embarking on a new strategy, where the Drupal Association plays a more direct role in both Drupal product innovation and marketing.

Enterprise customers are key to maintaining a healthy ecosystem for a CMS. Their investment flows through to the agencies building, maintaining, supporting, and hosting large-scale projects, providing consistent, repeat income that ultimately benefits our open source community in the form of stable jobs, community funding, and sponsored code contribution. 

Looking more closely at the challenges of succeeding in the enterprise market, how do you gain access to and awareness among key decision makers in large organisations, such as the CIO, CTO and, increasingly, the CMO (Chief Marketing Officer)? They are the people likely to read analyst reports from Gartner and Forrester. While Acquia features as a leader in these reports and relies heavily on Drupal for its platform, Drupal's name recognition is largely absent from them.

Acquia has also had great success with their Engage events that target key decision makers, but it's been a challenge to attract a similar audience to the more community and developer-focused DrupalCon events. 

While the Drupal Association itself has historically had limited relationships with Drupal's large end users, partner agencies who rely on Drupal's open source software for their clients absolutely do have these relationships.

The Drupal Association is in a strong position to give our agency partners as much playbook-style assistance as possible to retain existing enterprise clients or win new ones. For example, do we have a pitch deck on hand to help an agency argue why Drupal is superior to Adobe or Sitecore? Are there pre-packaged product demos that can be consistently updated to highlight new features?

This is an area where we currently fall short in the Drupal community, with most agencies replicating efforts for every new client engagement. It's something we're starting to address with the Drupal Certified Partner program; imagine what we could achieve if we harnessed the strength of hundreds of agency salespeople pitching Drupal to their clients every day. New agencies joining a partner program need to see a clear pathway to building their teams' expertise and being able to sell Drupal to their clients to grow their businesses. The largest global digital agencies have tended to struggle with engaging with open source software communities, so bridging that gap is critical.

The other group of people we need to convince in any large organisation are the people who’ll be using our product - the developers, content editors and systems engineers. C-level decision-makers lean heavily on this group to evaluate and make recommendations about what platform they should be considering. To influence this group, our product needs to look and function like a modern piece of software, fulfil contemporary requirements or be quickly downloadable for a working demo of the software.

In terms of where we already clearly win, rapid innovation is the thing that we do very well in the open source world. Maintaining the speed of innovation, though, is an area that has been harder for Drupal as both the software and community have matured. A big philosophical hurdle we’ve faced is the notion of the Drupal Association directing budget to innovation projects when people often have an expectation that contribution is “free”. But contribution has never been free! An individual or company has always borne the cost in personal time or wages. Other big open source projects have absolutely no stigma about funding projects with actual money, such as the Linux Foundation's $160m annual funding towards projects.

The Drupal community dipped its toe into this model last year with the Pitchburgh contest, which saw $98,000 worth of projects completed in a relatively short amount of time because they had the budget. We're also in the process of hiring people at the Drupal Association who can facilitate innovation and remove roadblocks to contribution.

Now, all we need is the funding to scale this model up. Imagine if just 1% of the $3bn spent on Drupal-related projects each year went towards funding strategic innovation - that would be a $30m budget to work with!

Similarly, the idea that Drupal would be “marketed” as a product by the Drupal Association has never been a core competency. This is the legacy of being structured as a 501c3 not-for-profit in the USA where funds are for the “advancement of a charitable cause”. Our charitable cause is ensuring Drupal remains a Digital Public Good that supports the United Nations’ Sustainable Development Goals. But if there isn't positive product awareness about Drupal in the broader market, then market share will slip and our ability to support the goals around being a Digital Public Good will suffer as a result. 

Whether we call it marketing or advocacy, we need to ensure Drupal as a product is commercially successful. We’ve had a Promote Drupal working group within the Drupal community for a number of years that has driven a range of broader marketing initiatives. The Drupal Association has now taken on an active role in this by commissioning a go-to-market strategy targeting the enterprise sector. This will be rolling out in 2024 as funding for specific marketing initiatives becomes available. 

At the cheaper end of the scale, this might include coordinating speakers at non-Drupal tech events or managing positive media coverage. At a higher budget scale, it might include Drupal-branded booths at major tech conferences, like the one we recently built for Web Summit in Lisbon, or running global campaigns to build Drupal product awareness. 

Our other huge advantage as an open source community is the strength and depth of our developer pool. We do encounter a perception issue when it comes to attracting younger developers to our platforms because there are so many shiny new things to play with. Building robust outreach, training, mentoring, certification and professional pathways is the key to maintaining a sustainable developer pool as those of us with 20+ years of experience head towards the other side of middle age.

So, where can you start to help with all of this? 

  1. If you're a professional services company that relies on Drupal for your business, get involved with the Drupal Certified Partner program. This is the fastest way to both contribute to Drupal's innovation as a product and play a direct role in driving adoption.

  2. If you rely on Drupal as your organization's CMS software, become a Supporting Partner and help fund Drupal's sustainability. 

  3. If you're passionate about maintaining the Open Web, the Drupal Association can accept your philanthropic donation.

  4. Send your team members to DrupalCon or a regional DrupalCamp to connect with the community.

This level of engagement will help Drupal maintain its status as the platform of choice for large-scale projects.

The Drop Times: Zoocha: The Drupal Development Specialists in the UK

Discover the journey of Zoocha, a leading Drupal Development Agency renowned for its commitment to innovation, sustainability, and client success. Since its inception in 2009, Zoocha has demonstrated a profound dedication to open-source technology and agile methodologies, earning notable certifications and accolades. With a portfolio boasting collaborations with high-profile clients across various sectors, Zoocha excels in delivering customized Drupal solutions that meet unique client needs.

This article delves into Zoocha's strategies for engaging talent, promoting Drupal among younger audiences, and its ambitious goal to achieve carbon neutrality by 2025. Explore how Zoocha's commitment to quality, security, and environmental sustainability positions it as a trusted partner in the dynamic world of web development.

Four Kitchens: Custom Drush commands with Drush Generate


Marc Berger

Senior Backend Engineer

Always looking for a challenge, Marc tries to add something new to his toolbox for every project and build — be it a new CSS technology, creating custom APIs, or testing out new processes for development.


Recently, one of our clients had to retrieve some information from their Drupal site during a CI build. They needed to know the internal Drupal path from a known path alias. Common Drush commands don’t provide this information directly, so we decided to write our own custom Drush command. It was a lot easier than we thought it would be! Let’s get started.

Note: This post is based on commands and structure for Drush 12.

While we can write our own Drush command from scratch, let’s discuss a tool that Drush already provides us: the drush generate command. Drush 9 added support to generate scaffolding and boilerplate code for many common Drupal coding tasks such as custom modules, themes, services, plugins, and many more. The nice thing about using the drush generate command is that the code it generates conforms to best practices and Drupal coding standards — and some generators even come with examples as well. You can see all available generators by simply running drush generate without any arguments.

Step 1: Create a custom module

To get started, creating a new custom Drush command this way requires an existing custom module in the codebase. If one exists, great: you can skip to Step 2 below. If you need a custom module, let's use Drush to generate one:

drush generate module

Drush will ask a series of questions such as the module name, the package, any dependencies, and if you want to generate a .module file, README.md, etc. Once the module has been created, enable the module. This will help with the autocomplete when generating the custom Drush command.

drush en <machine_name_of_custom_module>

Step 2: Create custom Drush command boilerplate

First, make sure you have a custom module where your new custom Drush command will live and make sure that module is enabled. Next, run the following command to generate some boilerplate code:

drush generate drush:command-file

This command will also ask some questions, the first of which is the machine name of the custom module. If that module is enabled, it will autocomplete the name in the terminal. You can also tell the generator to use dependency injection if you know what services you need to use. In our case, we need to inject the path_alias.manager service. Once generated, the new command class will live here under your custom module:

<machine_name_of_custom_module>/src/Drush/Commands

Let’s take a look at this newly generated code. We will see the standard class structure and our dependency injection at the top of the file:

<?php

namespace Drupal\custom_drush\Drush\Commands;

use Consolidation\OutputFormatters\StructuredData\RowsOfFields;
use Drupal\Core\StringTranslation\StringTranslationTrait;
use Drupal\Core\Utility\Token;
use Drupal\path_alias\AliasManagerInterface;
use Drush\Attributes as CLI;
use Drush\Commands\DrushCommands;
use Symfony\Component\DependencyInjection\ContainerInterface;

/**
 * A Drush commandfile.
 *
 * In addition to this file, you need a drush.services.yml
 * in root of your module, and a composer.json file that provides the name
 * of the services file to use.
 */
final class CustomDrushCommands extends DrushCommands {

  use StringTranslationTrait;

  /**
   * Constructs a CustomDrushCommands object.
   */
  public function __construct(
    private readonly Token $token,
    private readonly AliasManagerInterface $pathAliasManager,
  ) {
    parent::__construct();
  }

  /**
   * {@inheritdoc}
   */
  public static function create(ContainerInterface $container) {
    return new static(
      $container->get('token'),
      $container->get('path_alias.manager'),
    );
  }

Note: The generator adds a comment about needing a drush.services.yml file. This requirement is deprecated and will be removed in Drush 13, so you can ignore it if you are using Drush 12. In our testing, this file does not need to be present.

Further down in the new class, we will see some boilerplate example code. This is where the magic happens:

  /**
   * Command description here.
   */
  #[CLI\Command(name: 'custom_drush:command-name', aliases: ['foo'])]
  #[CLI\Argument(name: 'arg1', description: 'Argument description.')]
  #[CLI\Option(name: 'option-name', description: 'Option description')]
  #[CLI\Usage(name: 'custom_drush:command-name foo', description: 'Usage description')]
  public function commandName($arg1, $options = ['option-name' => 'default']) {
    $this->logger()->success(dt('Achievement unlocked.'));
  }

This new Drush command doesn’t do very much at the moment, but provides a great jumping-off point. The first thing to note at the top of the function are the new PHP 8 attributes that begin with the #. These replace the previous PHP annotations that are commonly seen when writing custom plugins in Drupal. You can read more about the new PHP attributes.

The different attributes tell Drush what our custom command name is, description, what arguments it will take (if any), and any aliases it may have.

Step 3: Create our custom command

For our custom command, let’s modify the code so we can get the internal path from a path alias:

  /**
   * Get the internal path for a given path alias.
   */
  #[CLI\Command(name: 'custom_drush:internal-path', aliases: ['intpath'])]
  #[CLI\Argument(name: 'pathAlias', description: 'The path alias, must begin with /')]
  #[CLI\Usage(name: 'custom_drush:internal-path /path-alias', description: 'Supply the path alias and the internal path will be retrieved.')]
  public function getInternalPath($pathAlias) {
    if (!str_starts_with($pathAlias, "/")) {
      $this->logger()->error(dt('The alias must start with a /'));
    }
    else {
      $path = $this->pathAliasManager->getPathByAlias($pathAlias);
      if ($path == $pathAlias) {
        $this->logger()->error(dt('There was no internal path found that uses that alias.'));
      }
      else {
        $this->output()->writeln($path);
      }
    }
    // $this->logger()->success(dt('Achievement unlocked.'));
  }

What we’re doing here is changing the name of the command so it can be called like so:

drush custom_drush:internal-path <path>

or via the alias:

drush intpath <path>

The <path> is a required argument (such as /my-amazing-page) because of how it is declared in the getInternalPath method. The method first checks that the path starts with /. If it does, it looks up the internal path for that alias and, if one exists, prints it, i.e., /node/1234. Errors are reported through the logger() method inherited from the DrushCommands class, while the result itself is written with output(). It's a simple command, but one that helped us automatically set config during a CI job.
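
Putting it together, a run of the finished command might look something like this, using the illustrative alias from above; the command prints the matching internal path (e.g., /node/1234):

drush intpath /my-amazing-page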

Table output

Note the boilerplate code also generated another example below the first — one that will provide output in a table format:

  /**
   * An example of the table output format.
   */
  #[CLI\Command(name: 'custom_drush:token', aliases: ['token'])]
  #[CLI\FieldLabels(labels: [
    'group' => 'Group',
    'token' => 'Token',
    'name' => 'Name',
  ])]
  #[CLI\DefaultTableFields(fields: ['group', 'token', 'name'])]
  #[CLI\FilterDefaultField(field: 'name')]
  public function token($options = ['format' => 'table']): RowsOfFields {
    $rows = [];
    $all = $this->token->getInfo();
    foreach ($all['tokens'] as $group => $tokens) {
      foreach ($tokens as $key => $token) {
        $rows[] = [
          'group' => $group,
          'token' => $key,
          'name' => $token['name'],
        ];
      }
    }
    return new RowsOfFields($rows);
  }

In this example, no argument is required, and it will simply print out the list of tokens in a nice table:

 ------------ ------------------ -----------------------
  Group        Token              Name
 ------------ ------------------ -----------------------
  file         fid                File ID
  node         nid                Content ID
  site         name               Name
  ...          ...                ...
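
Because the command returns a RowsOfFields object, Drush's standard structured-output options apply as well. As a hedged example (these are stock Drush output options, not something the generator adds), the same data can be rendered in other formats or limited to specific fields:

drush custom_drush:token --format=json
drush custom_drush:token --fields=group,token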

Final thoughts

Drush is a powerful tool, and like many parts of Drupal, it’s expandable to meet different needs. While I shared a relatively simple example to solve a small challenge, the possibilities are open to retrieve all kinds of information from your Drupal site to use in scripting, CI/CD jobs, reporting, and more. And by using the drush generate command, creating these custom solutions is easy, follows best practices, and helps keep code consistent.

Further reading

The post Custom Drush commands with Drush Generate appeared first on Four Kitchens.

Tag1 Consulting: The DDEV Local Development Environment: Talking with Maintainer Randy Fay

Randy Fay, the maintainer of DDEV, discusses the key features and functionality of DDEV, a Docker-based development environment that streamlines setting up and managing local development for applications (no Docker knowledge is required). Whether you're creating applications in Python, PHP, or other languages, DDEV will save you tremendous time and effort. It also works great for managing multiple projects or working with a large, distributed team, ensuring everyone's configurations remain in sync. Randy also demos DDEV, showcasing how fast and easy it is to set up a local Drupal development environment. Additionally, he touches upon the history and future of DDEV, and the critical role of the DDEV user community in both supporting the project and shaping its development. This interview is perfect for any developer interested in efficient development tools, current DDEV users, or anyone curious about local development technologies and best practices.

For a transcript of this video, see The DDEV Local Development Environment - Talking with Randy Fay.

Links:

  • DDEV: ddev.com
  • Docs: https://ddev.readthedocs.io
  • CMS Quickstarts: https://ddev.readthedocs.io/en/stable/users/quickstart/
  • DDEV 2023 Review: https://ddev.com/blog/2023-review
  • DDEV 2024 Plans: https://ddev.com/blog/2024-plans...

michaelemeyers | Wed, 03/13/2024 - 04:40

Liip: Throwback to Drupal Mountain Camp 2024

Against the backdrop of snow-capped peaks and invigorating alpine air, attendees immersed themselves in a whirlwind of workshops, discussions, and outdoor activities showcasing the vibrant spirit of open-source technology and the beauty of the Swiss mountains.

We took this opportunity to ask Josef Kruckenberg, Product Owner at Liip and co-organiser of the Drupal Mountain Camp, a few questions.

Josef Kruckenberg, Product Owner at Liip and co-organiser of the Drupal Mountain Camp ©Patrick Itten

What are the goals of the Drupal Mountain Camp?

The Drupal Mountain Camp brings together experts and newcomers in web development to share their knowledge of creating interactive websites using Drupal and related web technologies. We are committed to uniting a diverse crowd from different disciplines, such as developers, designers, project managers, agency and community leaders.

The main highlights include:

  • Pre-conference with skiing, snowboarding and co-working in Davos in the Swiss Alps
  • 3 keynotes on headless CMS, open-source funding and personal development
  • 3 days with workshops, sessions and exchanges around the open-source CMS Drupal

What is your involvement in this, also as a Liiper?

As Drupal Community Coordinator at Liip, Jens Vranckx and I are part of the organising team that makes the Drupal Mountain Camp happen. For this year's edition, I focused on recruiting keynote speakers and inviting speakers from other countries to provide a rich and diverse line-up. I also work with our marketing team, coordinate some logistics at the venue, encourage Liipers to speak, and have fun taking pictures of the event.

In our workshop, Drupal for End Users, Jonathan Noack and I compared the different ways of creating landing pages with Drupal and allowed the audience to test blökkli, our interactive, open-source page-building solution.

As a board member of the Drupal Switzerland association, I’m also organising a Drupal Local Association Updates session that acts as an exchange format for open-source leaders in countries like France, Belgium, and Switzerland.

Jonathan Noack presenting blökkli ©Patrick Itten

Which speech inspired you the most and why?

I especially enjoyed Tearyne Almandarez's talk about grit and personal development. It reminded me of how I dealt with difficult challenges in my career, how imposter syndrome can hold one back, and how important it is to find clarity about where you want to go, especially if that means you have to step outside of your comfort zone.

What outcomes would you like to share following this edition?

The Swiss and international Drupal community had a lot to share over the days of the Mountain Camp.

It's inspiring to see the multitude of approaches to solve key problems, such as interactive page building with Drupal.

I'm proud of the Liip team for contributing substantially to open-source by sponsoring and co-organising Drupal Mountain Camp and sharing our knowledge in many sessions.

Talk by Jutta Horstmann ©Patrick Itten

What are the next challenges?

The Drupal Mountain Camp is all about bringing people together. The organisers will get together, do a retrospective and get ready for the next iteration. What can we do to make it more accessible? Will we do it as usual in Davos? We had a lot of good discussions already at the conference, so I’m looking forward to seeing where we take the organisation next.

For Liip, we will continue investing highly in the open-source and Drupal community. We are excited to see how the community will use blökkli and what they contribute back to it.

Talking Drupal: Skills Upgrade 2

Welcome back to "Skills Upgrade," a Talking Drupal mini-series following the journey of a D7 developer learning D10. This is episode 2.

Topics
  • Review Chad's goals for the previous week
    • DDEV Installation
    • Docker for Mac vs other options
    • IDE Setup
  • Review Chad's questions
  • Tasks for the upcoming week
    • DDEV improve performance
    • Install Drupal 10
    • Install drupal/core dependencies
    • Configure and test phpcs
    • Test phpstan
    • Set up settings.local.php
    • Install devel module
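
For anyone following along at home, here is a rough, untested sketch of commands for some of these tasks. It assumes a recent DDEV release and the standard drupal/recommended-project layout; exact flags and package names may differ from what Chad ends up using:

# Create a Drupal 10 project with DDEV
mkdir my-drupal10-site && cd my-drupal10-site
ddev config --project-type=drupal10 --docroot=web --create-docroot
ddev start
ddev composer create drupal/recommended-project
ddev composer require drush/drush

# Dev dependencies: phpcs (via drupal/coder) and phpstan come with drupal/core-dev
ddev composer require --dev drupal/core-dev

# Run the code-quality tools against custom modules (phpstan needs a phpstan.neon)
ddev exec vendor/bin/phpcs --standard=Drupal,DrupalPractice web/modules/custom
ddev exec vendor/bin/phpstan analyse web/modules/custom

# Enable local settings: copy web/sites/example.settings.local.php to
# web/sites/default/settings.local.php and uncomment the include in settings.php

# Install the Devel module
ddev composer require --dev drupal/devel
ddev drush en devel -y
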
Resources

  • DDEV Performance
  • DDEV Quickstart
  • Drupal Core Dependencies
  • How to Implement Drupal Code Standards
  • Running PHPStan On Drupal Custom Modules
  • Why you should care about using settings.local.php
  • Rancher Desktop

  • Chad's Drupal 10 Learning Curriculum & Journal
  • Chad's Drupal 10 Learning Notes

Hosts

AmyJune Hineline - @volkswagenchick

Guests

Chad Hester - chadkhester.com @chadkhest

Mike Anello - DrupalEasy.com @ultimike