The Drop Times: Migration from Drupal 7 Simplified as Acquia’s Innovative Tool Goes FOSS

The Drupal ecosystem has gained a valuable addition now that Acquia has released its Acquia Migrate: Accelerate (AM:A) module as Free and Open Source Software (FOSS). Designed to ease the often daunting task of migrating from Drupal 7 to Drupal 9, AM:A is now accessible to all users via its Drupal project page.

LN Webworks: 10 Incredible Tips for Drupal Site Maintenance


Almost everything in the world, from our bodies to the structures we build, requires proper and timely maintenance to stay strong and function effectively. Business websites are no exception. Regular Drupal site maintenance is essential to keep your business website performing at an optimal level. Outdated software often lacks up-to-date security tools, new features, and functionality, and it suffers from compromised speed.

Moreover, failing to keep pace with the latest updates can put you at a competitive disadvantage. In short, the stakes of relying on an outdated CMS version are high. This blog shines a light on actionable tips for effective, proactive Drupal site maintenance.

PreviousNext: The Pitchburgh Diaries - decoupled Layout Builder Sprint 1 & 2

Welcome to the Pitchburgh Diaries, a fortnightly update on our progress as we work on our plan for a decoupled Layout Builder using React.

by lee.rowlands / 29 September 2023

Highlights

Let's start with a quick overview of what we've been working on:

  • Evaluation of JSON:API
  • Design of API and Open API specification
  • Progress towards persistence layer
  • Local development setup
  • API for formatters in React
  • API for Block plugins in React
  • API for Layout plugins in React
  • Evaluation of React drag and drop libraries
  • Drag and drop support for blocks and sections

Background

You may also be wondering what this is all about!

In case you missed it, at DrupalCon Pittsburgh we successfully pitched to build a decoupled version of Layout Builder powered by React. In addition to that initiative, our team has been working on bringing editorial improvements to Layout Builder in core. My colleagues Daniel Veza, Adam Bramley and Mohit Aghera have been working on this during our internal contribution time and company-wide innovation days.

As part of this, we've been collaborating with key contributors including the subsystem maintainer Tim Plunkett and Drupal core's product manager Lauri Eskola.

Lauri has been doing user research into Layout Builder pain points and trying to frame the long-term direction for Drupal's page building experience.

Lauri was able to take us through some wireframes he's been working on in this space and these aligned well with our plans for a decoupled layout builder.

I encourage you to review these and provide feedback in the core ideas queue issue.

So, now you know the background, let's get into a summary of what we achieved in the first two sprints.

JSON:API analysis

One of the deliverables for our bid is to create a persistence layer in Drupal to allow retrieving and updating the layouts from the decoupled editor.

With JSON:API in core, this felt like a natural starting point. We spent time reviewing the current patch to add JSON:API support to Layout Builder and identified some issues with it:

The JSON:API specification states that relationships must be present as a top-level field in a resource object. This is so related content is addressable.

With layout builder data there are a number of related resources as follows:

  • Sections
  • Components (block plugin instances)
  • Any referenced content these block plugins refer to. For example, the inline block plugin references a block content entity. A field block plugin might reference a media entity from an entity relationship field.

The current patch just emits the section and component data as-is and does not provide a way for these relationships to be surfaced at the root of a resource object under a relationships key.

Complying with the specification

So if we were to take a step back and think about how we would provide Layout Builder data and comply with the JSON:API specification – how would it look?

Going back to the data model we would need to allow addressing sections and components.

This would allow us to have a 'sections' field in the top-level resource as a relationship.

However, at present sections in Layout Builder data do not have identifiers. There is an open issue to add this, but that's only part of the problem. The larger issue is that if we needed to fetch a section resource, we have no way of retrieving it by ID. This is because Layout Builder data is stored in a blob inside a SectionList field on the entity to which the layout is attached. To make that possible, we'd need to maintain some sort of 'component index' table. This would allow us to retrieve the layout entity based on a section ID. We would probably have to vary each section resource based on the layout plugin, just like we do for nodes by node-type. For example, we'd have a section--onecol and a section--twocol resource.

So let's say we had that. The next issue we have is we need to be able to address components. These do already have a UUID, but again that information is only stored in the field blob. That means we'd need to extend our 'component index' idea to also support looking up a component.

Then from there, we need to be able to distinguish between fields/properties on a per-component basis. For example, an inline block component has a reference to a block content entity via a relationship. Other components (block plugins) don't. We would therefore need to vary resource types by block plugin ID. As a result, we might have component--inline-block and component--field-block--node--field-media.
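
To make that concrete, here is a rough sketch of what a spec-compliant component resource might look like, expressed as a plain TypeScript object literal. The resource type names, attributes and IDs are purely illustrative; nothing like this exists in the current patch.

// Hypothetical, illustrative resource only.
const componentResource = {
  data: {
    // Resource type varied by block plugin ID, as discussed above.
    type: "component--inline-block",
    id: "uuid-of-the-component",
    attributes: {
      region: "content",
      weight: 0,
      configuration: { label: "Hero" },
    },
    relationships: {
      // The referenced block content is surfaced as a top-level relationship,
      // making it addressable as the JSON:API specification requires.
      block_content: {
        data: { type: "block_content--basic", id: "uuid-of-the-block-content" },
      },
    },
  },
};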

With all those pieces in place, we should be able to achieve addressable resources for all of the related data in Layout Builder. However, the query to retrieve all the information we need would be quite involved, as it would need to traverse from node to section to component to block content or media.

All of this is only thinking about retrieval. We also need to be able to persist changes, particularly to block content for inline blocks. When a layout is updated we would need to write changes to the block content, then to each component, then to each section and finally to the entity the layout applies to. Each of these would be a separate HTTP request. What happens if one of those requests fails? How would we recover from that state? JSON:API has already thought of this issue and has an atomic extension in the specification.

However, there is no existing implementation of this extension for Drupal, so we'd also need to write that.
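
For illustration, a request body using that extension might look roughly like the sketch below. The resource types are the hypothetical ones from the previous section; no Drupal implementation of this exists yet.

// The Atomic Operations extension groups several writes into one request,
// applied as a single transaction.
const atomicRequest = {
  "atomic:operations": [
    {
      op: "update",
      data: {
        type: "block_content--basic",
        id: "uuid-of-the-block-content",
        attributes: { body: "Updated copy for the hero block" },
      },
    },
    {
      op: "update",
      data: {
        type: "component--inline-block",
        id: "uuid-of-the-component",
        attributes: { region: "second", weight: 1 },
      },
    },
  ],
};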

Considering all this work and the limited budget for this project, we decided that JSON:API isn't a good fit at this time. We intend to open a meta issue with child issues for each of the discrete pieces of work that would be required to build a JSON:API-compliant endpoint for Layout Builder.

Specifying our own API

As we decided not to use JSON:API, we were able to design an API specifically tailored to the work we're doing. But we need to make sure to document it. Open API is the best way to do this as we can auto-generate rich documentation with tools such as Swagger UI.

We have finalised the API specification in the Decoupled LB API module on Drupal. Oh, and did I mention that GitLab automatically turns this into interactive documentation with Swagger UI? 😮😍
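
To give a flavour of the format, here is a hypothetical fragment showing how one of the operations might be described, written as a plain TypeScript object. The path and schema names are guesses; the canonical specification lives in the module.

const openApiFragment = {
  paths: {
    "/decoupled-lb-api/{entity_type}/{entity_id}/layout": {
      get: {
        operationId: "getLayoutAsJson",
        parameters: [
          { name: "entity_type", in: "path", required: true, schema: { type: "string" } },
          { name: "entity_id", in: "path", required: true, schema: { type: "string" } },
        ],
        responses: {
          "200": {
            description: "The layout, its sections and its components as JSON.",
            content: {
              "application/json": {
                schema: { "$ref": "#/components/schemas/Layout" },
              },
            },
          },
        },
      },
    },
  },
};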

We have started work on implementing this and have working versions of the following operations, including test coverage:

  • getSectionsAsJson
  • getBlocksAsJson
  • getLayoutAsJson

Part of this involved some key decisions around how block plugins in Drupal will interact with React. More on that later.

Collaboration with the Gutenberg pitch

There is some overlap in the work we're doing with the Gutenberg project, another Pitchburgh grant winner. As a result, we've been catching up with that team to ensure we aren't duplicating effort.

One such issue we identified was that both projects will need a shared way to load React from Drupal. At present, the Gutenberg module includes a transpiled version of React and React DOM and is loading these from a libraries definition.

As we will also need to do something similar, and we don't want to end up with two versions of React loaded from Drupal, we identified that we'd need a common shared module for this.

We approached the maintainers of the Drupal 7 React module and asked if we could be added as maintainers to use this namespace for that purpose. The maintainer David Corbacho was happy to oblige. Thanks, David!

Local dev setup

Our next step was to get a local development setup that allows front-end developers to contribute to the project without needing to set up Drupal. This includes the tooling front-end developers expect.

Setup of Redux for state management

To keep track of the layout state, we chose Redux for global state management. One of the features present in Lauri's wireframes is Undo and Redo support. Keeping track of state with Redux means this will be simple to implement when the time comes. Using Redux also allows us to separate state manipulation from the UI, which means we can easily write unit tests without needing to simulate interaction.
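
As a rough illustration of that idea, a Redux Toolkit slice that tracks component ordering with undo/redo history might look like the sketch below. The slice, action names and state shape are hypothetical and much simpler than the real layout state.

import { createSlice, PayloadAction } from "@reduxjs/toolkit";

interface LayoutState {
  past: string[][];   // previous orderings of component UUIDs, for undo
  present: string[];  // current ordering of component UUIDs
  future: string[][]; // undone orderings, for redo
}

const initialState: LayoutState = { past: [], present: [], future: [] };

const layoutSlice = createSlice({
  name: "layout",
  initialState,
  reducers: {
    // Record the current ordering before applying a change so it can be undone.
    moveComponent(state, action: PayloadAction<{ from: number; to: number }>) {
      state.past.push([...state.present]);
      state.future = [];
      const [moved] = state.present.splice(action.payload.from, 1);
      state.present.splice(action.payload.to, 0, moved);
    },
    undo(state) {
      const previous = state.past.pop();
      if (previous) {
        state.future.push([...state.present]);
        state.present = previous;
      }
    },
    redo(state) {
      const next = state.future.pop();
      if (next) {
        state.past.push([...state.present]);
        state.present = next;
      }
    },
  },
});

export const { moveComponent, undo, redo } = layoutSlice.actions;
export default layoutSlice.reducer;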

Selection of Drag and Drop library

A key tenet of the layout editing experience is the ability to drag and drop blocks (components). Currently, Drupal core supports moving blocks, but not moving sections. There is an active issue to add support for re-ordering sections. One of our goals is to support that functionality from the outset.

Drag and Drop in React is not unique to this project so we did an evaluation of existing React drag and drop libraries. The two we focussed on were react-dnd and react-beautiful-dnd.

After reviewing both packages we decided to go with react-beautiful-dnd. It features keyboard control, auto-scrolling of drop containers and screen-reader support. It was written by Atlassian (creators of Jira, Confluence, etc.). The only item of possible concern was that it is no longer under active development; it is, however, still receiving releases for security issues. We rejected react-dnd because it lacks built-in support for keyboard control, screen readers and scrolling. There is also an active security report in its issue queue that has not yet been responded to.

If we run into issues with the 'no further development' status of react-beautiful-dnd, there is a sympathetic fork; hopefully it won't come to that. Another plus in the react-beautiful-dnd column is that it was used by puck, an existing React-powered layout editor. We evaluated that as part of this project and it provides some great inspiration, but it doesn't fit with Drupal's data model or the longer-term goals for in-place editing seen in Lauri's wireframes.

After selecting this we set about building a prototype using it to validate the API and have dragging blocks and sections working.
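
For a sense of the mechanics, here is a minimal, self-contained sketch of dragging blocks within a single region using react-beautiful-dnd. The component and prop names are illustrative rather than the project's actual API.

import React from "react";
import { DragDropContext, Droppable, Draggable, DropResult } from "react-beautiful-dnd";

interface Block {
  uuid: string;
  label: string;
}

export const RegionSketch: React.FC<{
  blocks: Block[];
  onReorder: (from: number, to: number) => void;
}> = ({ blocks, onReorder }) => {
  const handleDragEnd = (result: DropResult) => {
    // Dropped outside any droppable: nothing to do.
    if (!result.destination) {
      return;
    }
    onReorder(result.source.index, result.destination.index);
  };

  return (
    <DragDropContext onDragEnd={handleDragEnd}>
      <Droppable droppableId="region-content">
        {(provided) => (
          <div ref={provided.innerRef} {...provided.droppableProps}>
            {blocks.map((block, index) => (
              <Draggable key={block.uuid} draggableId={block.uuid} index={index}>
                {(dragProvided) => (
                  <div
                    ref={dragProvided.innerRef}
                    {...dragProvided.draggableProps}
                    {...dragProvided.dragHandleProps}
                  >
                    {block.label}
                  </div>
                )}
              </Draggable>
            ))}
            {provided.placeholder}
          </div>
        )}
      </Droppable>
    </DragDropContext>
  );
};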

Video: dragging and dropping blocks and sections in the prototype.

Data model design in React

One of the driving factors behind the design of the layout builder in React is that it must map back to existing concepts in Drupal. We're building a layout editor, but Drupal will still need to be able to render the layout on the front end once the layout is saved.

To manage this we've designed an API for React components that mirrors familiar Drupal concepts. We have built proofs of concept for Block plugins, Layout plugins and Formatter plugins.

We've built a hook system to support retrieving entity view display information client-side.

To validate all of this we built an InlineBlock React component that maps to the InlineBlock plugin in Drupal core. The video above showing blocks being dragged around makes use of these components. Each component returned from the API includes a plugin ID, and in the proof of concept these map to React components.

In the example, you see the InlineBlock component taking the values from the API, reading entity view mode information and using that to hand off rendering to formatter plugins. Similarly, the API returns sections using their Drupal plugin ID. These are mapped to React components, so a one-column component handles the one-column section seen in the example and a two-column component handles the two-column section.
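
To make that formatter hand-off concrete, here is a hypothetical sketch of a formatter plugin component; the project's real formatter API may well differ.

import React from "react";

interface FormatterProps {
  // Field items as delivered by the API, e.g. [{ value: "<p>Hello</p>", format: "basic_html" }].
  items: Array<Record<string, unknown>>;
  // Formatter settings from the entity view display.
  settings: Record<string, unknown>;
}

// A minimal text formatter, mirroring Drupal's text_default formatter:
// it is handed field values and is responsible for its own markup.
export const TextDefaultFormatter: React.FC<FormatterProps> = ({ items }) => (
  <>
    {items.map((item, index) => (
      <div key={index} dangerouslySetInnerHTML={{ __html: String(item.value ?? "") }} />
    ))}
  </>
);

export const id = "text_default";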

This ensures that each component is in control of its markup, just like in Drupal with Twig. Regions are handled by a Region component that abstracts away the drag-and-drop functionality. Each Layout plugin just needs to emit its regions and still has full control over the markup. In the example, the right column of the two-column layout uses an aside element.

Here's how that looks in code:

import React from "react";
import { LayoutPluginProps } from "../../state/types.ts";
import Region from "../LayoutEditor/Region.tsx";

// JSX body reconstructed from the description below: section props are spread
// onto the outer wrapper, each region is placed via the Region component, and
// the second region is rendered in an aside element.
export const Preview: React.FC<LayoutPluginProps> = ({
  section,
  sectionProps,
  regions,
}) => {
  return (
    <div {...sectionProps}>
      <div>
        <Region regionId="first" />
      </div>
      <aside>
        <Region regionId="second" />
      </aside>
    </div>
  );
};

export const id = "layout_twocol_section";

For those of you who have worked with a layout Twig template in Drupal, this will look pretty familiar. You are given some section props to output on the outer container, just like the attributes you get in Twig. You are given a map of region data and you can place the Region component where you need it; the only requirement is that you pass along the ID in the regionId prop. This isn't that different from the region_attributes variable in Twig.

All of this is booted from a single LayoutEditor component that takes a registry of block, layout and formatter plugins. As work progresses this will expand to include widgets too.

Each of these registries is a map where the keys are the plugin IDs and the values are promises that will load the components. In our local development setup, we're using Vite's glob support to automatically load all plugins in a given folder. Our thinking is that, in the Drupal space, we will use an alter hook to add additional properties to plugin definitions and then use drupalSettings to emit a mapping of plugin IDs to file paths. We will be able to use the built-in import function with the file paths to provide the promises.

So, for example, a hook_block_alter() implementation will add a property to each block plugin definition containing the file path of the equivalent React component. This will allow modules and themes to alter that definition and swap out a component provided by default with their own React implementation specific to their project. We will collate all of these when booting the LayoutEditor in Drupal and pass them via drupalSettings into the entry point. We will do something similar for the other plugins.
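
As a sketch of how those registries might be wired up under these assumptions (the drupalSettings key below is hypothetical):

import React from "react";

// A registry maps a Drupal plugin ID to a lazily loaded React component.
type ComponentLoader = () => Promise<{ default: React.ComponentType<any> }>;
type PluginRegistry = Record<string, ComponentLoader>;

// Local development (Vite): discover every block component in a folder.
// import.meta.glob maps file paths to lazy import functions, so
// "./blocks/inline_block.tsx" becomes the loader for plugin ID "inline_block".
const blockRegistry: PluginRegistry = Object.fromEntries(
  Object.entries(import.meta.glob("./blocks/*.tsx")).map(([path, loader]) => [
    path.replace("./blocks/", "").replace(".tsx", ""),
    loader as ComponentLoader,
  ])
);

// In Drupal: a plugin ID to file path mapping emitted via drupalSettings
// (the "decoupledLb.blockComponents" key is a guess), turned into the same
// registry shape using the built-in import() function.
declare const drupalSettings: {
  decoupledLb: { blockComponents: Record<string, string> };
};

const drupalBlockRegistry: PluginRegistry = Object.fromEntries(
  Object.entries(drupalSettings.decoupledLb.blockComponents).map(
    ([pluginId, filePath]) => [
      pluginId,
      () => import(/* @vite-ignore */ filePath) as ReturnType<ComponentLoader>,
    ]
  )
);

Either way, the LayoutEditor only ever sees a map of plugin IDs to loader functions, so the local and Drupal setups stay interchangeable.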

Approach for normalizing section components

All of this also relies on block plugins in Drupal having control over the data that is sent in the API. For instance, in the inline-block example, the plugin needs to be able to pass along the block content entity's field values for rendering and editing purposes.

The current API handles this using Drupal's existing serializer. The decoupled LB API module adds a normalizer for SectionListInterface objects (how layouts are modelled in Drupal). This loops over each component and normalizes an instance of the relevant block plugin. The default block plugin normalizer just extracts the block configuration. However, due to how Drupal's serializer works, any module can add a new normalizer that targets a more specific class than BlockPluginInterface, give it a higher priority, and modify how the data is sent.

To support working with inline blocks, we've added a normalizer for the InlineBlock plugin that intersects the block content entity's fields with those of the configured view mode and passes them along. In the example above you can see we're rendering the body field of the block. This will also provide us with the implementation point for persisting updates. We will be able to mutate these properties in the layout as edits occur and send them back to Drupal. The normalizer will be able to denormalize the incoming values and put them back into the format Drupal expects, so saving the layout works the same as it does now.
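
Client-side, the normalized data might end up looking something like the TypeScript shape below; the real property names are defined by the module's normalizers and may differ.

// Hypothetical shape of a normalized component and section as the React app
// might receive them.
interface NormalizedComponent {
  uuid: string;
  // Drupal block plugin ID, e.g. "inline_block:basic"; used to pick the React component.
  pluginId: string;
  region: string;
  weight: number;
  // Block configuration extracted by the default block plugin normalizer.
  configuration: Record<string, unknown>;
  // Added by the InlineBlock normalizer: block content field values intersected
  // with the configured view mode, keyed by field name.
  fieldValues?: Record<string, unknown>;
}

interface NormalizedSection {
  // Drupal layout plugin ID, e.g. "layout_twocol_section".
  layoutId: string;
  layoutSettings: Record<string, unknown>;
  // Components keyed by region name.
  components: Record<string, NormalizedComponent[]>;
}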

Next steps

We've made significant progress in the first two sprints. Our high-level goals for the next sprints are:

  • Finalising the persistence layer in line with the Open API specification
  • Adding support for widgets and editing

Thanks

Thanks to my colleagues at PreviousNext who've been a great sounding board in validating some of these design decisions, in particular Daniel Veza who has helped to workshop many of these ideas.

Tagged

Pitchburgh

Evolving Web: What's Your Future With Drupal in an AI-Led World?


No one wanted to be left behind when the internet was gaining popularity in the 90s. Everyone was creating websites and experimenting with online communication—yet few really knew what they wanted to get out of it. The same thing is happening now as generative AI is taking hold. 

We’re on the cusp of a technological paradigm shift and everyone is clamouring to get ahead of the curve. But we can’t foresee all the opportunities AI will unlock, or all the new problems that it will bring. As a result, many people are acting and investing without clear direction. 

Over the past five years I've seen an increase in clients wanting to integrate AI into their apps and sites. Yet few of them have specific goals they want AI to help them achieve. Instead, they're operating under the misconception that innovation is always good.

Meanwhile, investments have created a technology bubble where growth happens for growth’s sake. Investors are keen to buy into AI startups (which require significant capital to purchase GPUs), as well as SaaS companies (which are moving to the cloud with the promise of infinitely scaling up).

All this means that innovation is being driven by the emerging technology itself, instead of by society and people. And that’s a problem. 

How the Open Web Can Help Vet and Harness AI

As technologists, we have a responsibility to stay on top of new technologies. To read Stack Overflow, listen to technology podcasts, and try things out on GitHub. We also have a responsibility to question new ideas. To hold them up to the light and see if they bring enough value for the price.

Both the appraisal and application of innovative technology can be done more effectively through the open source community. Why? Because it provides a diversity of perspectives and expertise, which enables us to hold technology to a higher standard.

Drupal’s Open Web Manifesto speaks directly to this desire to use technology for good. With support from the Drupal Association, the Drupal community has the capacity to make AI technologies more accessible, impactful, and safer for everyone. It can tackle the big picture questions that individuals alone can’t. Things like how to close the privacy gaps in an AI integration and how to evolve Drupal’s functionality to take advantage of AI. 

Your Drupal website is improved each time you update the platform. Sometimes these improvements allow you to leverage technology you didn't even know you needed yet. For example, Drupal is interoperable and provides JSON:API—a feature that most Drupal websites don't yet use, but can in the future.

 

“I do think that AI can be the ultimate user experience for a lot of people.”

– Dries Buytaert, Founder of Drupal & Acquia

 

Dries Buytaert, the founder of Drupal and Acquia, spoke at EvolveDrupal about where Drupal is going.

AI Will Transform the User Interface, Says Drupal’s Founder

Earlier this month I interviewed Dries Buytaert, the founder of Drupal and Acquia, at EvolveDrupal Toronto. He talked about the opportunities he sees for Drupal to leverage AI in the short, mid, and long term. 

Short-term

Dries spoke about immediate opportunities to use generative AI to help with content creation. He noted that Drupal already has a few solutions in this area, including the OpenAI module—which allows you to generate content with ChatGPT directly within the Drupal UI, as well as do things like auto-generating taxonomy or translating your content into multiple languages. Dries also noted that there may be possibilities to automatically generate images or short videos.

Mid-term

In the medium term, Dries sees AI being used to help people navigate sites and search content. “I believe AI tools can help assist users with finding the information they’re looking for. It can make it more accurate and also more intuitive to find the information.” Dries noted that this is already becoming a reality with enterprise search tools.

Long-term

The longer term “is where it gets really interesting, but also unclear”, says Dries. He believes there is potential for AI to become the new user interface for experience creation. Today, building engaging content still requires a fair amount of knowledge, skills, and clicking. But Dries predicts that in the future, users will be able to tell AI in human language to generate full pages and components in specific styles. 

“It will figure out what modules you need and maybe do the initial configuration. Then a developer or a site builder can come in and fine-tune it, because I'm sure it won't be perfect.”

“Maybe it's a big vision, but I think that's where we're going. I don't know exactly how we'll get there, but I do think that AI can be the ultimate user experience for a lot of people. That's why I think it's important for us to pay attention.” 

 

 

Psst… want to attend talks like this in person? See our upcoming EvolveDrupal events!

What’s Next for You? 3 Ways to Evolve With Drupal

If you’re using Drupal, you’re already part of a community that can help you leverage innovations—including AI tech—in a thoughtful and effective way. I think there are three ways that we can all proactively innovate with Drupal:

1. Build new things with Drupal

An example is my team’s recent work with McGill University to create Data Homebase, the first web application of its kind in North America. It’s revolutionizing the housing industry by standardizing circular housing data and using data visualization to champion sustainable homebuilding. 

2. Innovate alongside Drupal

Because of its composable nature, we can integrate adjacent innovative tools into our technology stack. We built the Bibliothèque et Archives nationales du Québec (BAnQ) website with Drupal as its CMS because of its ability to seamlessly integrate Acquia Cloud for hosting, Azure AD for employee logins, API-connected Drupal forms, and Google Custom Search. This allows archivists, researchers, and the public to take full advantage of the vast resources of one of Canada’s most prominent cultural hubs. 

3. Improve Drupal itself

Our team regularly contributes to Drupal—for example, we recently helped convert Olivero components into Single Directory Components and proposed enhancements to LayoutBuilder at the EvolveDrupal Toronto summit. We encourage everyone to try their hand at contributing, no matter their skill level or background! Check out the Drupal AI Community Initiative to see what Drupal AI solutions have already been created and what projects are currently underway.

+ more awesome articles by Evolving Web

Metadrop: Drupal Camp Sevilla: How It Went

Drupal Camp Sevilla has come to an end, and we have all returned home, but the memories and knowledge gained remain with us.

I would like to extend my sincere thanks to all those who made this event possible. This gathering is a product of the dedicated Drupal Community, and the results have been truly outstanding!


 

The Business Day

For many editions now, the first day of Drupal Camp has been the Business Day. It is oriented towards meeting other companies and organisations that use Drupal and trying to establish synergies. This time there were almost 30 of us, many more than last year. We discussed the Business Day and Drupal through questions like: Is it worth it? What can we do to improve it and enhance engagement with Drupal? What can we propose? We split into small groups, and I can talk about what we discussed: yes, the…

Drupal Core News: Join strategic and community initiatives at DrupalCon Lille

DrupalCon Lille is less than three weeks away, with more than a thousand in-person attendees expected, including the leads of various key initiatives. Meet the people leading improvements to Drupal core and drupal.org and advance the platform together! Here are some chances to meet them and connect.

Keynotes

The Driesnote on Tuesday at 1:30pm will of course include an update on some key initiatives as well as other exciting insights into where Drupal is headed.

The Drupal Initiative Leads keynote on Thursday at 1:30pm will feature Mike Herchel (New toolbar), Sascha Eggenberger (Admin UI improvements), Chris Wells (Project Browser), Suzanne Dergacheva (Promote Drupal), Felip Manyer Ballester (localize.drupal.org's Drupal 10 port), and Fran Garcia-Linares (drupal.org's GitLab). Meet them after the keynote on Friday to get involved in their respective initiatives!

Drupal's future

There are various sessions beyond the keynotes to discuss where Drupal is headed.

Alejandro Moreno Lopez, Scott Massey, Cristina Chumillas, AmyJune Hineline and Nick Veenhof will discuss Innovation and the future of Drupal on Thursday at 9:15am.

To discuss the Drupal Association's strategy of supporting Drupal's innovation and other questions, join Tim Lehnen, Tim Doyle, Baddy Breidert and Dries Buytaert on Wednesday at 4:15pm at Drupal Association Staff + Board + Community Q&A.

More specifically Lauri Eskola, one of Drupal's Product Managers will talk about Making Drupal a Better Out-of-the-Box Product on Thursday at 11:30am.

Adopt new frontend tools

Some of the recent frontend initiative results are now at a stage where you can adopt them!

Join Mike Herchel on Thursday at 9:15am to learn about Drupal's new Single Directory Components (SDC), which will revolutionize how you create reusable components for your modules and themes.

Andy Blum will dive deep into Drupal 10's recently stabilised starterkit tool on Thursday at 10:30am. In One Theme To Rule Them All: Using StarterKits to accelerate theme development and reduce technical debt he will show you how this new tool changes theme creation for agencies and product teams for years to come.

Get involved with admin interface improvements

In Next Drupal admin UI improvements on Thursday at 10:30am, Cristina Chumillas and Sascha Eggenberger will provide an overview of various admin UI improvements in the making.

Christian López Espínola will join Cristina Chumillas to immerse you in the discussion around a new Drupal dashboard in So I logged in, now what? The Dashboard initiative welcomes you on Tuesday at 4:15pm.


Project Browser

Check out the Project Browser Initiative: Where We're At and How You Can Help session on Tuesday at 3:00pm by Leslie Glynn and Chris Wells. Later on, find Chris in the contribution rooms most days to collaborate on Project Browser development. Meet Leslie at mentored contribution events to try Project Browser.

After the session Leslie will also hold a BoF on Tuesday at 4:15pm: Maintainers - prepare your modules to shine in the Project Browser for project maintainers to get the most out of their placement in Project Browser.

Automatic updates

There is a whole panel discussion about Automatic Updates with key contributors Peter Wolanin, Tim Lehnen, Wim Leers and Jess (xjm): What's Next for Drupal Autoupdates on Thursday at 3pm.

Backend improvements

Wim Leers will talk about the work he is coordinating in Drupal's next leap: configuration validation on Thursday at 5:15pm. Configuration validation will not only make decoupled solutions easier to implement, it will also make deployments more consistent and reliable.

David Bekker, Drupal core's database API maintainer, will present Why we moved the database drivers to their own modules and what we want to do next on Thursday at 5:40pm.

Drupal.org improvements

DrupalCon Lille offers a wide variety of content and ways to get involved with improvements to Drupal.org. Drupal.org Update: Accelerating contribution will provide an overview with Tim Lehnen, Neil Drumm, Brendan Blaine and Fran Garcia-Linares on Wednesday at 11:30am.

Fran Garcia-Linares will present a dedicated session api.drupal.org, a journey from Drupal 7 to Drupal 10 on Wednesday at 3:00pm, while Felip Manyer Ballester and Nicolas Loye will delve into Drupal 10 localization server upgrade initiative status on Tuesday at 5:15pm. Felip Manyer Ballester will also hold a BoF discussion on Wednesday at 3pm on Contribution credits for translation activity on localize.drupal.org.

Contribution day

All of the initiatives mentioned in this post will have key folks working on contribution day on Friday (20th of October), while some of them will be present in the contribution rooms on earlier days as well. Join them at their respective tables to be part of Drupal's future!

And even more

These were just some of the top initiatives we are tracking and there will be a lot more things you can learn and be involved with at DrupalCon. Check out the full DrupalCon Lille schedule for sessions and especially BoFs to get involved and make sure to join contribution spaces throughout the event and on Friday.

The Drop Times: Nonprofit Drupal Community: A Haven for Open-Source Enthusiasts

Discover the enduring commitment of nonprofits to open-source solutions within the Drupal ecosystem. Co-organizer Johanna Bates shares insights into the vibrant 'Nonprofit Drupal' community, where professionals converge to discuss the intersection of Drupal and nonprofit work. Explore their journey, monthly chats, and dedication to fostering knowledge-sharing among those passionate about using Drupal for mission-driven endeavors.

Drupal Association blog: Drupal 7 End of Life: The First Steps in your Drupal 7 Migration Process

As you may have heard, Drupal 7 is reaching its End of Life on 5 January 2025 – fourteen years to the day that it was released. While migrating from Drupal 7 may seem daunting, the Drupal Association is here to assure you that the process can be smooth from start to finish! In our series of Drupal 7 End of Life blog posts, throughout the next 15 months we’ll be sharing all of the tips and tricks to ensure you have the easiest migration possible.

As you approach the start of your site migration, you may be asking yourself: where do I even begin? Don't feel embarrassed if you don't know where to start! Plenty of site owners are in the same boat as you. Luckily, we're here to help! There are three steps you should complete first in your Drupal 7 End of Life site migration process:

  1. Read our questionnaire to determine your needs
  2. Determine your budget for migration
  3. Select a certified partner and/or use our DIY resources to build out a plan

Not only has the Drupal Association created a questionnaire to assist you in deciding which direction to take your Drupal 7 sites, but we have also created an entire portal of partners and resources. To find the best fit for you, we invite you to take our Drupal 7 Site Owner questionnaire here (scroll down, and you'll find it under ‘Understanding my Options as a Drupal 7 Site Owner'). Once you've taken the questionnaire, you can begin working on your budget plan for 2024 and determine whether you'll be working with a certified partner from our list or using DIY resources to migrate.

So, when should you begin these steps? We recommend starting as soon as possible, ideally completing these three steps by the end of Quarter 4 of 2023! By determining the best path for your site, planning your budget, and then selecting your certified partner(s) or resources, you’ll be on your way to migrating your site by the End of Life date of 5 January 2025. 

What does End of Life mean for you?

In software terms, end of life means that the version of that software no longer receives feature updates, bug fixes, or security releases. This last point is the most important. If a security vulnerability is discovered after the end of life date, it may be publicly disclosed, and you will be unable to update your site to protect against the issue. For this reason, we recommend beginning to plan your migration now. 

Whether you want to take advantage of new functionalities with Drupal 10 or opt for another option, we’re here to support you.

Learn more on our Drupal 7 End of Life page now, and stay tuned for more blogs in our Drupal 7 End of Life series!

LN Webworks: How to Boost the Performance and Scalability of Drupal Websites


Performance and scalability are two factors that greatly affect the profitability and impact of a website. If your site is slow and cannot accommodate increasing traffic, you'll undoubtedly lose potential customers, which will hold back your business's growth. After all, research indicates that 40% of people abandon a site that doesn't load within 3 seconds. Given that, a nimble and scalable website is the need of the hour for any business aspiring to succeed in this highly competitive world. Drupal development is a potent way to build such sites, as the CMS is pre-equipped for fast performance and incredible scalability.

However, as all websites are unique and have different requirements, Drupal's performance and scalability depend on multiple components. In this blog, we’ll examine some of these components and delve into how you can utilize them to boost site speed and scalability.