CTI Digital: Unlock Your Marketing Potential with Drupal 11: An In-Depth Guide

As businesses strive to succeed online, the need for an effective content management system (CMS) becomes paramount. Drupal 11, the latest version of this powerful CMS, is packed with new features and improvements designed to enhance your marketing strategy.

We’re excited to share how these advancements can elevate your marketing efforts and help create exceptional digital experiences for your audience.

Tag1 Consulting: Migrating Your Data from D7 to D10: Migrating view modes and field groups

Today, we're building on our previous work with field widget settings. We will cover migrating view modes, a prerequisite for migrating field groups and field formatter settings. We’ll then walk through migrating field groups. Field formatter settings will be addressed in our next article.

Drupal Starshot blog: Drupal CMS Update for Mid-September 2024

We are in the middle of September and that means it’s time for our regular update on what’s going on with the Drupal CMS. Let’s check out what’s new!

Documentation

The Drupal Association is on a mission to bring world-class documentation and maintenance software to Drupal CMS for launch and beyond. To achieve that, we have teamed up with Drupalize.me. Our common goal is to make sure that by the time Drupal CMS hits the market, we have an easy-to-follow user guide suitable for our target audience. We have also prepared an announcement that will be revealed at DrupalCon Barcelona. Stay tuned for the details!

Contact Form

We are happy to announce that we have found a track lead for the Contact Form track! J. Hogue from Oomph joined the team very recently but has already shown progress: he and his team are busy with research and MVP mapping. It’s great to have you with us!

Blog

According to the most recent update from Laurens Van Damme and his team, the MVP version has been set up, and the team is now researching how to align with Drupal CMS standards.

Events

Martin Anderson-Clutz and his team are considering expanding functionality with options that will not be applied by default. They are also working on an alternative calendar solution that would have more community support, as well as UX improvements for the date widget. We can’t wait to see the result of their work!

Data Privacy / Compliance

Jürgen Haas tells us that the information research and the framing of the track’s scope have been completed, and the team has agreed on the documentation. Therefore, the following three action items have been set as the immediate next priorities:

  • continue documentation

  • break down existing feature list into deliverables/recipes and prioritise them

  • define components for the "Compliance Audit" module as an additional deliverable

Trial experience for Starshot

Some exciting news is coming from Matt Glaman: the trial now displays an interactive installer of Drupal CMS. Meanwhile, work on styling the trial so that it looks like the Drupal CMS installer while it is being set up is going full steam ahead.

Dashboard

The team, led by Christian López Espínola and Matthew Tift, now has wireframes to rely on, so it’s time to get things rolling! They are busy looking deeper into the Gin theme, in particular config actions for adding blocks to the dashboards from other recipes. A decision still has to be made on whether the right sidebar should consist of shortcuts or individual blocks. At the same time, planning for the upcoming activities is in progress, and we are waiting patiently to see more details on what’s ahead.

SEO

Great news from Jim Birch and John Doyle with the SEO track team: the Basic and Advanced SEO Recipes have been committed to the repository. The next priority is to continue iterating on the recipes, documentation, and guidelines for other tracks.

Content Publishing Workflows

We are excited to announce one more valuable addition to the team - Mohammed Razem from Vardot is joining the Drupal CMS crew as the track lead of the Content Publishing Workflows track. Welcome on board!

Advanced Search

The 1xINTERNET team has been busy finalising the first version of the concept. Their next target is to gather insights from the survey asking Drupalers what they prefer to use, in order to confirm the earlier findings from the specification.

Media Management

Tony Barker informs us that the Media track team is researching the features of content management systems identified in the strategy document, as well as working through the information and ideas from the earlier questionnaire. They keep experimenting with modules and configuration to make the necessary choices, focusing on features that can make it into early recipes over the coming weeks.

Accessibility Tools

From Gareth Alexander we learn the following: with the discovery process and the gathering of insights on common practices finalised, the team has published a survey and is now busy reviewing the results. A review of currently available modules has also been completed, and the list has been narrowed down to the modules now being assessed for feasibility.

This will lead to a proposal for the features and recipes for Accessibility Tools to be considered for inclusion in early versions of Drupal CMS.

Proposal creation is underway and the next steps are being defined.

Analytics

Dharizza Espinach meanwhile shares that the team has finished the market research, and is currently wrapping up the work on a comparison with other tools. The selected tools are to be included in the recipes. Another objective the track team is busy with is preparing the list of recommendations for features that should be included later in the project. The proposal document is underway and iterating over a first version of the basic recipe will be the next step.

AI

Jamie Abrahams is working on something very special that will be released during DrupalCon Barcelona. So I will keep the suspense and let you discover the details for yourself in just one week!

New Track Announcement!

As we make progress with the already defined deliverables, we keep discovering missing pieces of the puzzle. To close those gaps, we are excited to announce two new tracks that we have not only set up but also managed to get staffed:

A heartfelt welcome to all of the new Drupal CMS track leads - we are excited to have you and are looking forward to all the expertise you bring along!

I truly hope you find the news outlined above as exciting as we do, and I look forward to sharing even more at DrupalCon Barcelona. So if you somehow haven’t got your ticket yet - better hurry up!

LN Webworks: How To Render A Custom Form In Drupal Block

Blocks are content containers displayed in distinct regions of a website, such as social sharing buttons, “Who’s online” sections, recently viewed content, and social media feeds. Custom blocks created through the GUI behave similarly to node entities. This guide gives a concise overview of their implementation, focusing primarily on custom blocks built by creating a custom module.

Step-by-Step Guide To Displaying Custom Forms In Drupal Blocks

Step 1: Create a Custom Module 

First, create a directory using the following command:

mkdir modules/custom/mymodule   

Create the Necessary Files


Within the modules/custom/mymodule directory, create the following files:

mymodule.info.yml
src/Plugin/Block/CustomBlock.php
src/Form/MyCustomForm.php

Ensure that these files are properly structured to fit the requirements of your module.

1. mymodule.info.yml
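
The info file declares the module to Drupal with keys such as name, type: module, description, and core_version_requirement. The two PHP files then do the actual work. As a rough sketch of what they might contain (the plugin ID, labels, and form fields below are illustrative placeholders, not code from the original article), the block plugin renders the custom form by delegating to Drupal’s form builder, and the form class defines the fields and submit handling:

  <?php
  // src/Plugin/Block/CustomBlock.php

  namespace Drupal\mymodule\Plugin\Block;

  use Drupal\Core\Block\BlockBase;

  /**
   * Provides a block that renders the custom form.
   *
   * @Block(
   *   id = "mymodule_custom_form_block",
   *   admin_label = @Translation("My custom form block")
   * )
   */
  class CustomBlock extends BlockBase {

    /**
     * {@inheritdoc}
     */
    public function build() {
      // The form builder returns a render array, which the block outputs.
      return \Drupal::formBuilder()->getForm('Drupal\mymodule\Form\MyCustomForm');
    }

  }

  <?php
  // src/Form/MyCustomForm.php

  namespace Drupal\mymodule\Form;

  use Drupal\Core\Form\FormBase;
  use Drupal\Core\Form\FormStateInterface;

  /**
   * A minimal example form rendered inside the block.
   */
  class MyCustomForm extends FormBase {

    public function getFormId() {
      return 'mymodule_my_custom_form';
    }

    public function buildForm(array $form, FormStateInterface $form_state) {
      $form['name'] = [
        '#type' => 'textfield',
        '#title' => $this->t('Your name'),
        '#required' => TRUE,
      ];
      $form['actions']['submit'] = [
        '#type' => 'submit',
        '#value' => $this->t('Submit'),
      ];
      return $form;
    }

    public function submitForm(array &$form, FormStateInterface $form_state) {
      // Show a confirmation message with the submitted value.
      $this->messenger()->addStatus($this->t('Thanks, @name!', ['@name' => $form_state->getValue('name')]));
    }

  }

After enabling the module and clearing caches, the block appears in the block layout UI and can be placed in any region, where it renders the form.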

Horizontal Digital Blog: Why you shouldn't upgrade from Drupal 7

Yes, that title borders on click-bait, but fundamentally it is true. While it is possible to use the Drupal upgrade tools to migrate a site off of Drupal 7, there are reasons why, almost 9 years after the release of Drupal 8, there are still thousands of sites on Drupal 7. If it were easy to upgrade your way to a newer platform, you would have done it by now.

The Drop Times: Reimagining the Limits of Possibilities

Dear Readers,

"Imagine a world where creating or cloning any application becomes as simple as issuing a command: "Build me that."

The capabilities of AI are evolving at a fast pace, and the developments that have been brought forth are unprecedented. The recent demonstration of the O1 model by Reuven Cohen is a striking testament to this progress. Imagine taking a complex content management system like Drupal, traditionally built on PHP, and seamlessly transforming it into a JavaScript-based platform—all within an hour. This is precisely what the O1 model has achieved. Reuven tested this revolutionary AI by converting the entire Drupal CMS into Node.js, and the results were nothing short of astounding. The model didn't just convert code; it reimagined and replicated the whole system in a new language, paving the way for revolutionary changes in software development. 

By providing the O1 model with a detailed prompt, including Drupal's structure, API, and taxonomy, it generated a complete implementation plan in mere seconds. 

"It seamlessly transformed the codebase from PHP to JavaScript," 

Reuven noted, emphasizing how the model made cross-language conversion appear effortless. Once the backend was replicated into what can be called 'Drupal.js,' the process moved on to the front end. Utilizing GPT-Engineer, a UI was designed, wrapping up the entire transformation process in just about an hour.

What makes the O1 model truly exceptional is not just its ability to replicate; it goes beyond to fully clone and transform. The AI didn’t stop at code conversion—it provided a clear specification, built the necessary folder and file structure, and even automated the setup process with a bash script. This is more than a mere glimpse into the future; it is a tangible demonstration of how software development could evolve. 

As we stand on the brink of this new era, it's clear that these advancements will redefine our approach to software development and open-source projects like Drupal. With tools like the O1 model, the boundaries of what's possible are expanding, making this an exhilarating time for developers and tech enthusiasts alike.

With that, let's move on to the stories from last week.

In an interview with Kazima Abbas, a Sub-Editor of The DropTimes (TDT), Julian Chabrillon, a seasoned full-stack developer at ES IMAGINACION, shares his inspiration behind Noah's Page Builder and how he developed the tool. With the help of his wife and work partner, Sofía García de Blas, he shaped its visual identity, creating a cohesive and user-friendly experience.

Among other notable stories, The DropTimes conducted a video interview with Piyuesh Kumar, the Director of Technology at QED42. Starting as an intern and with almost 14 years with QED42, he watched Drupal evolve from a basic content management framework into a powerful digital experience platform. Piyuesh also shares excitement about his upcoming sessions at DrupalCon Barcelona, where he’ll explore topics like AI’s role in supporting Digital Experience Platforms (DXP) and the importance of designing with privacy in mind, particularly in the context of GDPR.

Last week, Drupal.org refreshed its website with new fonts, transitioning from the previous fonts, Ubuntu and Ubuntu Sans, to ZT Gatha and Noto Sans. This update improves the platform's visual appeal and usability, reflecting the ongoing effort to modernize its design and user experience.

DrupalCon Barcelona is just a week away, and The DropTimes has published several articles and lists to guide attendees through the event. DrupalCon Barcelona 2024, happening from September 24-27, will feature several sessions focused on the Drupal Starshot Initiative, a project aimed at simplifying Drupal while incorporating advanced technologies like AI, no-code tools, and browser-based development. Take a look at 10 key sessions that will provide valuable insights into Drupal’s next steps.

This year’s event also promises to be an exciting convergence of ideas and innovations, with AI taking center stage in a range of sessions. As AI technologies continue to transform the way we build and interact with digital experiences, DrupalCon Barcelona 2024 will feature 13 AI-focused sessions, offering attendees the chance to dive into the latest advancements and explore how AI can revolutionize the future of web development.

Building on the success of DrupalCon Lille 2023 and continuing the partnership with TerraVerde Sustainability, the DrupalCon Barcelona 2024 organizers remain committed to advancing sustainability initiatives and creating lasting positive change. As DrupalCon Barcelona 2024 embraces a greener future, five key sustainability initiatives highlight the event’s commitment to reducing its environmental impact.

While you are there, the Drupal Association invites attendees of DrupalCon Barcelona 2024 to visit its booth to engage with the team, learn more about their ongoing initiatives, and explore ways to contribute to the community. 

Meanwhile, we have published a list of Drupal events happening this week for Drupal enthusiasts to attend. Find the content here.

The FOSSEPS and OSOR projects are conducting a survey to assess interest in forming a European Open Source User Group for public administrations. The initiative seeks input from IT professionals within EU public bodies on the current use and challenges of Free and Open Source Software (FOSS).

Darren Oh has introduced the business model behind Drupal Forge, focusing on sustaining Drupal’s software and infrastructure through vendor contributions. Under this model, vendors can partner with product owners, such as the Drupal Association, by offering product trials via launch buttons integrated into their hosting platforms. 

Kevin Quillen has taken up maintainership of the Netlify module for Drupal, a key integration that ensures seamless syncing between Drupal and Netlify. This module enables automatic triggering of builds in Netlify whenever changes are made to Drupal’s content or configuration, allowing headless websites, particularly those built on frameworks like Next.js, to stay current without caching issues.

As a final update, Monarq has developed a new system called Sections & Components to simplify content management in Drupal, enhancing user experience by making the platform more intuitive and flexible. This system enables site administrators to build customized, responsive pages by combining predefined sections and components such as text blocks, images, carousels, and call-to-action buttons.

We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now.

To get timely updates, follow us on LinkedIn, Twitter and Facebook. You can also join us on Drupal Slack at #thedroptimes.

Thank you, 
Sincerely 
Alka Elizabeth 
Sub-editor, The DropTimes.

Specbee: Cooking up irresistible Drupal websites with Recipes

Drupal's ongoing evolution has seen many innovations, the latest being the introduction of Recipes with the “Recipes Initiative” in Drupal 10.3. Recipes, now part of Drupal core, represent a significant shift in how developers can automate the setup and configuration of Drupal sites. A Recipe explores the idea of “composability”, enabling people to compose a Drupal website as per their needs, or at least a solid foundation for one. In this article, we’ll discuss Recipes in detail - what they are, why they’re fantastic, and how you can use them to create a perfectly crafted site. Get ready to cook up a storm with a foolproof recipe for Drupal success!

But we already have Distributions!

The concept of pre-configured packages is not new to Drupal. It was first introduced in Drupal 5 as Drupal distributions, which include Drupal core along with additional modules, themes, and configurations aimed at serving a specific use case or industry. This made it easier for developers to quickly set up Drupal for specific applications like intranets, e-commerce sites, or government portals without starting from scratch. Distributions offer a convenient way to get started with pre-configured setups, but they come with some drawbacks:

  • Limited Flexibility: Predefined features and tightly integrated modules make customization difficult.
  • Maintenance Complexity: Updating distributions can be challenging due to custom configurations, leading to potential compatibility issues.
  • Dependency on Maintainers: Some distributions may be poorly maintained or abandoned, posing risks to security and updates.
  • Performance Overhead: Unnecessary bundled modules can slow down the site and introduce vulnerabilities.
  • Niche Focus: Distributions are often tailored to specific use cases, making it difficult to adapt if your needs change.

Drupal Recipes solve these problems by offering more modularity. Instead of coupling everything together, Recipes let you add only the specific features you need, avoiding the bloat of unnecessary modules.

A Recipe in action

To illustrate, imagine I need to set up an Event feature. For this, I will apply (Recipes are not installed but rather applied) an “Events Content-Type” Recipe, which will set up an Event content type with the necessary attributes and fields, and configure views, metatags, and paths for Event content. This gives me a solid foundational starting point where 70-80% of the basic setup and configuration is done by the Recipe, and on top of it I can make customizations to configure other settings as per my requirements. Once applied, I am no longer dependent on the Recipe package: it can be safely removed from my project while keeping all the configured setup intact.

Benefits of Drupal Recipes

  • Modular Setup: Specific features or configurations can be added individually at any point in a project timeline.
  • Combine Multiple Recipes: Recipes can be easily combined or modified to fit specific use cases. This allows for a more customizable site-building experience, making it easier to adapt to changing requirements.
  • No Lock-in: Unlike distributions, which are often tightly integrated, Recipes give you more freedom to swap out or upgrade parts of your setup without being tied to a rigid structure.
  • Composable: As mentioned above, you can combine multiple Recipes, which means you can also compose Recipes from other Recipes. If you want event registration but also Commerce capabilities, you can easily create a new Recipe that applies both the Event and Commerce Recipes.

What Recipes can do

  • Install Modules and Themes: Recipes can automatically install necessary modules and themes.
  • Apply Configurations: Recipes can apply both default and selective configurations provided by modules.
  • Update Configurations: Recipes can update module configurations to fit your site’s needs through “config actions”.
  • Composable and Reusable: Recipes can be composed of other Recipes, making them highly modular and reusable across different projects.

What Recipes cannot do

  • Custom Code or Hooks: Recipes do not include custom PHP code, hooks, or API integrations.
  • Module-Like Functionality: Unlike modules, Recipes cannot contain custom plugins, forms, or other typical Drupal module structures.
  • Persistent Locking: Recipes do not persist after application; they set up the initial state and can be safely removed.

Want to take full advantage of Drupal Recipes? Schedule a consultation with our experts and discover how our Drupal development services can help boost your site's growth!

How to create and use a Recipe

To get started, it is recommended to create a new custom repository for the recipe. This ensures the recipe is version-controlled and can be managed efficiently for use on other projects. At a bare minimum, a recipe requires a “recipe.yml” file, which defines meta information like the name and description along with the modules/themes to install and the configuration to install or update. Apart from “recipe.yml”, a Recipe can also have the following optional items:

  • A “config” directory holding the configuration entity YAML files that will be installed when the Recipe is applied.
  • A “content” directory holding the content entity YAML files that will be created after the Recipe is applied.
  • A “composer.json” file that makes the Recipe discoverable via Composer and defines any dependencies the Recipe has on other modules or themes.
  • A “README.md” giving a brief description of the Recipe, so users can evaluate it more easily.

So the folder structure of a Recipe can look like this:

  recipe_name
    recipe.yml
    config
      node.type.event.yml
    content
      node
        43940d31-0106-46b4-ba32-39e511eb1f4a.yml
    composer.json
    README.md

Dissecting the recipe.yml file

A “recipe.yml” minimally consists of “name” and “description” keys. Apart from that, it can include three further keys.

Install packages with the “install” key, which specifies the modules and/or themes to be installed when the Recipe is applied. If not already installed, Drupal will install each of the modules and themes in the list.

  install:
    - address
    - datetime_range
    - media
    - media_library
    - geolocation_address
    - geolocation_leaflet
    - layout_builder
    - metatag
    - pathauto
    - paragraphs
    - smart_date

Configuration-related tasks go under the “config” key. This allows you to “import” configurations from a module in two ways:

  • Import all configurations from a module with the “*” wildcard. This imports all the base and optional configurations the module provides.
  • Import selective configuration from a module by specifying the list of configurations to be imported.

  config:
    import:
      media: "*"
      node:
        - views.view.content

When you want to update an active configuration that is being imported, or any existing one, “actions” come into play, allowing you to update those configurations:

  config:
    actions:
      metatag.settings:
        simple_config_update:
          entity_type_groups.node.event:
            - basic
            - advanced
      workflows.workflow.editorial:
        addNodeTypes:
          - event

Dependencies on other recipes can be declared under the “recipes” key, which lists the Recipes to be applied before the current Recipe.

  recipes:
    - core/recipes/image_media_type
    - core/recipes/editorial_workflow

Applying a Recipe

Until Phase 2 delivers a user interface (UI) for easier application, recipes are applied using Drupal core's PHP script. It can be executed with the following command from the Drupal root on the CLI:

  php core/scripts/drupal recipe recipes/recipe_name

After applying the recipe, it's also necessary to clear the caches to ensure the changes take effect.

Applying a hosted Recipe

If your Recipe has a “composer.json” file, it can be hosted on Packagist.org to make it discoverable and includable in any project. However, to download the recipe into a “recipes” directory, a few changes are required in the project’s “composer.json” file:

  composer require oomphinc/composer-installers-extender:2.0.1

Update the “installer-types” and “installer-paths” keys in “composer.json”:

  "installer-types": ["drupal-recipe"],
  "installer-paths": {
    "web/recipes/{$name}": [ "type:drupal-recipe" ]
  }

When you require a recipe with Composer, it will automatically be placed in your project's “recipes” directory, similar to how modules or themes are handled. Once this is set up, you can use Composer to require the package as usual. For this example, I am using a sample recipe which sets up an Event content type along with other configuration updates: https://github.com/malabya/event_content_type/

  composer require imalabya/event_content_type

Once downloaded, this can be applied with the Drupal core script:

  php core/scripts/drupal recipe recipes/event_content_type

Once the recipe is applied, it will create an Event content type and enable the Editorial workflow for the Event content type with default core configurations. It will also create two default Event content items to get started.

Unpacking Recipes

Even though Recipes do not lock in your site and can be safely replaced or removed once applied, you will want to maintain their dependencies in your project. When you require a Recipe, Composer downloads all the dependencies listed in the Recipe’s “composer.json”, but these dependencies are not copied/unpacked into the project’s “composer.json”, which makes it difficult to maintain or upgrade them. For this, add the Drupal Recipe Unpack Composer Plugin to your project’s “composer.json”, which extracts a package’s dependencies into the project’s root composer and lock files for use within Drupal recipes. Once the plugin is installed, you can run the command below to unpack the dependencies:

  composer unpack [organization/package-name]

Final thoughts

Recipes in Drupal 11 represent a powerful new tool for site builders and developers. They offer flexibility, modularity, and ease of maintenance that surpass traditional installation profiles and distributions.
When it comes to creating a new Drupal site or adding new features, Recipes provide a streamlined, efficient way of managing configuration. As Recipes continue to evolve, they promise to make Drupal site development more agile and responsive to changing needs, ultimately making Drupal a more accessible and powerful platform for all users. So do you want to cook from scratch or use Drupal Recipes? Your choice! Want to know how Drupal Recipes can enhance your project? Reach out to us to learn more about our Drupal development services and get started on your next big idea!

drunomics: Why we don't use GraphQL

Exploring the drawbacks of GraphQL in decoupled Drupal, including complexity, loose contracts, performance issues, and security concerns. RESTful alternatives are discussed.

At drunomics we have been building decoupled Drupal sites for more than five years. During this time, GraphQL has always been a popular choice for decoupled Drupal sites among professional and enterprise projects, thanks to the well-maintained GraphQL contrib module. Still, I’ve advised against using GraphQL for various enterprise projects, even though it was sometimes appealing to customers. In this blog post, I’d like to summarize why we don’t use GraphQL:

General complexity

GraphQL is not only a new query language for both frontend and backend developers to learn; the backend also has to support any kind of query the frontend makes. On the frontend side of things, additional libraries and tooling are needed to handle the protocol.

Loose contracts

GraphQL gives a lot of power to frontend developers, but that power comes at a steep price: there is no defined contract, or only a very loosely defined one, i.e. the data model, or more specifically the GraphQL schema layered on top of it. Based upon this loose contract, the frontend may compose any kind of query, which the backend has to support. Which leads to the next point:

Complex queries

When the backend exposes the Drupal data schema directly, a lot of things can potentially be leaked unintentionally, and changing things can become hard, because: who knows which data properties the frontend uses and queries for? It’s quite hard to optimize for every use case.

However, the backend may compose its own GraphQL schema and provide exactly the data model needed by the client, the frontend. That is indeed a great option to have, but it requires additional work and code to translate between the schema and the real data model behind it. It makes it possible to change the underlying data model and schema mapping while keeping the GraphQL output and schema the same or compatible. But is the code performing that mapping performant enough? Does it work correctly? That’s quite hard to tell without knowing exactly which queries one has to optimize and test for. So things are, or become, complex.

Performance

First of all, GraphQL is bad for caching since it makes use of POST requests. The typical workaround is to use shortened, hashed queries and to access them via GET requests, which can help mitigate the issue. But this comes at the cost of tying the deployed frontend and backend versions together, thus increasing overall system and deployment complexity. That way, the main GraphQL advantage - flexibility on the frontend - gets lost. So it is not an easy or great compromise to make.

Client driven data fetching

With GraphQL, the web browser (or, more generally, the client) sends a query to the server, specifying the exact data it needs. While this can help reduce payload size, it puts the client in the "driving seat". That often leads to additional round trips: based upon the response to the first request, further data is frequently needed for rendering, and that data has to be fetched in additional requests, increasing latency.

In contrast, when the server is in the "driving seat", it may efficiently do all queries and resolve additional data, and then send the resulting data over the slower network once.

Security

GraphQL queries can expose sensitive data if not properly secured. This can be mitigated by implementing proper authentication and authorization mechanisms. However, this can easily get very complex: since the server does not know the queries needed by the client, it has to handle every possible combination a client may request. Unfortunately, it’s commonly rather easy for attackers to purposely write computationally very expensive GraphQL queries and send them to the server, thus opening the door for denial-of-service (DoS) or even distributed denial-of-service (DDoS) attacks.

Besides that, because the backend has to cover all possible combinations, the danger of data accidentally leaking becomes rather high.

The conclusion

GraphQL comes with a number of issues, which are - as usual - solvable. That’s a price one might be willing to pay in certain situations, if the benefits are worth it. So, is using GraphQL a good idea? As so often, it depends. But in my experience, it’s more often not than it is.

Alternatives are RESTful

The typical alternative to GraphQL is a RESTful API. As usual, with Drupal there are a couple of good options:

  • Drupal comes with JSON:API out of the box, which is a great feature to have. While it’s a good fit in certain situations, it also faces some of the issues mentioned above, most notably "Client driven data fetching" and "Loose contracts".
  • Developers may use Drupal’s API to provide custom-coded RESTful endpoints for the client. That addresses all of the concerns mentioned above, but requires backend development time for every feature and, most notably, careful planning. It comes with the downside of frontend developers losing flexibility. (By the way, this is what GraphQL is loved for!) A minimal sketch of such an endpoint follows after this list.
  • Configurable RESTful endpoints. In order to improve the development process and gain flexibility in the frontend, we developed a solution for providing custom RESTful endpoints that are configurable via Drupal, by frontend developers. For that, we improved the Custom Elements module, which is part of Lupus Decoupled Drupal, so that it integrates with Drupal’s configuration system and provides a UI for customizing output by entity view mode. That way, in many situations, we can tick all the boxes while enabling the frontend developer to work efficiently. I’ll share more details about the new Custom Elements UI in a dedicated blog post later this week.
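
To make the custom-coded endpoint option more concrete, here is a minimal sketch of what such an endpoint could look like, assuming a hypothetical module named myapi and a route like /api/article/{node} declared in its routing file. This is an illustration of the approach, not drunomics’ actual code.

  <?php

  namespace Drupal\myapi\Controller;

  use Drupal\Core\Controller\ControllerBase;
  use Drupal\node\NodeInterface;
  use Symfony\Component\HttpFoundation\JsonResponse;

  /**
   * Returns a hand-crafted JSON payload for the decoupled frontend.
   *
   * The route (e.g. /api/article/{node}) would be declared in
   * myapi.routing.yml; module name and route are hypothetical.
   */
  class ArticleApiController extends ControllerBase {

    /**
     * Builds the response for a single article node.
     */
    public function article(NodeInterface $node) {
      // The backend decides exactly which fields are exposed, so the
      // contract with the frontend stays explicit.
      return new JsonResponse([
        'id' => (int) $node->id(),
        'title' => $node->label(),
        'created' => (int) $node->getCreatedTime(),
      ]);
    }

  }

Because the server controls the shape of the response, the contract stays explicit, responses are plain GET requests that can be cached, and there is no arbitrary query surface to secure - which is exactly the trade-off discussed above.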
  • Configurable RESTful endpoints. In order to improve the development process and gain flexibility in the frontend, we developed a solution for providing custom RESTful endpoints that are configurable via Drupal, by frontend developers. For that, we improved the Custom Elements module, which is part of Lupus Decoupled Drupal, such that it integrates with Drupal's configuration sytem and provides an UI for customizing output by entity view-mode. That way, in many situations, we can tick all the boxes, while enabling the frontend developer to work efficiently. I'll share more details about the new Custom Elements UI in a dedicated blog post later this week.