Metadrop: Optimizing Drupal Performance - Internal Page Cache

The Internal Page Cache is a core module in Drupal responsible for caching pages requested by anonymous users.

When a page is cached and an anonymous user makes a new request, Drupal does not need to perform any rendering or page-building processes. It simply retrieves the rendered page from the cache and sends it to the client.

The reason it only applies to anonymous users and not authenticated users is that the page returned to the client must have exactly the same content for all users.

In the case of authenticated users, although part of the content may be the same for everyone, there are always elements that can vary, such as the user block displaying the user's name or other user-specific information.

For these cases, there is the Dynamic Page Cache module, which handles caching for both anonymous and authenticated users.

Functionality

Cache Bin

For storing and managing cached pages, the Internal Page Cache defines its own cache bin, called “page”, meaning that cached pages are stored independently of the other cache bins that exist in Drupal.
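
As a minimal sketch (not part of the original article, and assuming a standard Drupal 9/10 site), the “page” bin can be remapped in settings.php, for example to disable it entirely during local development. This requires the null cache backend service to be available, as in core's example.settings.local.php, and a cache rebuild afterwards:

  // settings.local.php (sketch): route the "page" bin to the null backend
  // so responses for anonymous users are never stored.
  $settings['cache']['bins']['page'] = 'cache.backend.null';

  // The lifetime of cached pages follows the "Browser and proxy cache maximum age"
  // setting (system.performance: cache.page.max_age), configurable at
  // Administration » Configuration » Development » Performance.

In normal operation you can check whether a page was served from this bin by looking at the X-Drupal-Cache response header, which the module typically sets to HIT or MISS.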

Drupal Association blog: The Drupal Association Announces 2024 Board Election Winner and 2 Additional New Board Members 

The Drupal Association is saying goodbye to two board members and welcoming three new board members who will join the Drupal Association Board.

The Drupal Association extends a sincere thank you to Nikki Flores and Nick Veenhof for their service and dedication, not only to Drupal, but to the Drupal community. Thank you for everything you both have done while on the Drupal Association Board! Your time spent on the board made such a difference to the future of the Drupal project, and we thank you all for participating with grace, thoughtfulness, and insightful contributions.

The Drupal Association would now like to congratulate our newest board members, officially announced during the recent public board meeting at DrupalCon Barcelona:

Sachiko Muto

Stella Power

Alejandro Moreno

An additional congratulations to Alejandro Moreno for winning the community-elected seat during our 2024 At-Large Board Election! We cannot wait to see what amazing things Alejandro will accomplish while on the Drupal Association Board. We invite you to get to know Alejandro and learn more about his background.

I'm deeply honored to have been elected to the Drupal Association Board. Thank you to everyone for your trust and support—I look forward to serving our incredible community! - Alejandro Moreno

We extend our gratitude to all the candidates who participated in the 2024 election. On behalf of all the staff and board of the Drupal Association, a heartfelt Drupal Thanks to all of you who stood for the election this year. It truly is a big commitment to contribution, the Drupal Association, and the community, and we are so grateful for all of your voices. Thank you for your willingness to serve, and we hope you’ll consider participating again in 2025!

Detailed Voting Results

There were 7 candidates in this year’s At-Large board member election.

707 voters cast their ballots out of a pool of 2741 eligible voters.

Under Approval Voting, each voter can give a vote to one or more candidates. The final total of votes was as follows.

Candidate – Votes
Albert Hughes – 197
Will Huggins – 154
Alejandro Moreno – 282
Janna Malikova – 234
Kevin Quillen – 189
Matthew Saunders – 188
Dominique De Cooman – 208

1xINTERNET blog: Community survey analysis and revised implementation of search in Drupal CMS (formerly Starshot)

As track leads for search in Drupal CMS (formerly Starshot), we developed a concept and aligned it with the other leaders. To refine our approach, we conducted a survey to gather feedback and identify potential improvements. This article presents the survey results and the planned implementation.

Drupal Association blog: Introducing the Revolutionary Drupal CMS at DrupalCon Barcelona: A New Era in Drupal with AI-Enabled Website Building

BARCELONA, Spain, 25 September 2024 — This September, the digital world witnessed a groundbreaking moment at DrupalCon Europe in Barcelona (23-27 September), as Dries Buytaert, the visionary founder of Drupal, unveiled a preview of Drupal CMS in his landmark 40th Driesnote address. This launch isn't just an announcement: we're standing at the threshold of a new era in the Drupal story, marking the most significant evolution since its birth in 2001.

Drupal CMS pairs Drupal's existing enterprise-grade security and scalability with newly adopted AI capabilities and a revolutionized user experience through Experience Builder, all wrapped in an easy-to-use interface suitable for any skill level.

“The product strategy is for Drupal CMS to be the gold standard for no-code website building,” said Dries Buytaert, Drupal Founder and Project Lead. “Our goal is to empower non-technical users like digital marketers, content creators, and site-builders to create exceptional digital experiences without requiring developers… We are not abandoning Drupal Core, developers, or enterprise users. Drupal CMS is built on Drupal Core, so we need to make sure Drupal Core does well. The way to think about Drupal CMS is a product or strategy that complements what we already do.”

Drupal CMS: The Future Unveiled in Early 2025

Imagine a world where the enterprise-class power of Drupal is not just for the technically adept but is accessible to everyone. This is the reality Drupal CMS brings to the table. Tailor-made to empower marketers, web designers, and organizations, its preconfigured options are the key to effortlessly crafting and managing websites that stand out.

Why Drupal CMS is a Game-Changer  
  • Smart Defaults and Pre-Built Solutions: Out-of-the-box readiness meets customizable solutions—whether for corporate sites, blogs, or marketing platforms, enriched with advanced site-building capabilities. 

  • Seamless Onboarding: The intuitive install process is like having a guide, helping you cherry-pick the features your project needs.

  • AI-Powered Efficiency: With cutting-edge AI tools such as AI site migration and alt text creation, site building accelerates as you edit content types, set up fields, craft taxonomies, and more—with AI agents transparently handling various website actions.

  • Advanced SEO: Site builders can easily use powerful SEO features such as sitemaps, focus keywords, meta tags, and more with minimal effort.

  • Open-Source Power: Drupal CMS thrives on open-source collaboration, harnessing the collective innovation of a global community. Full customizability and an endless range of extensions prevent vendor lock-in.

Ready to Dive In? 

Curious minds can explore a demo in the Driesnote today —experience the future of web building firsthand. And stay tuned: the official launch is set for 15 January 2025, with even more innovations on the horizon.

A Responsible Approach to AI  

In line with Drupal's open web manifesto, a responsible AI policy ensures every AI feature in Drupal CMS is crafted with ethical considerations at its core—because innovation should go hand-in-hand with integrity. 

Join us on this exhilarating journey as we step into the future with Drupal CMS—where imagination meets innovation, and building websites will never be the same.

For a closer look at Dries's presentation and to explore Drupal CMS, watch his full Driesnote.

About DrupalCon

DrupalCon is the flagship event for the global Drupal community, bringing together thousands of developers, designers, marketers, and business leaders to connect, learn, and share knowledge about Drupal. Held in various cities around the world, DrupalCon features keynotes, workshops, and networking opportunities that highlight the latest innovations in the Drupal ecosystem.

About Dries Buytaert

Belgium-born Drupal founder Dries Buytaert is a pioneer in the Open Source web publishing and digital experience platform space. He is the Founder of Drupal, the Open Source software for building websites and digital experiences. Drupal is one of the largest and most active Open Source projects in the world. Dries has been working on Drupal for more than two decades and continues to lead the project today as Drupal's Project Lead.

Specbee: Why You Can’t Ignore Google Crawlability, Indexability & Mobile-First Indexing in SEO

Do you ever feel like you're living ahead of your time when you watch those world-conquering AI movies? Such movies gain popularity with every passing day because a similar reality is not too far away. The world today lives wire-free and digitally, and living online is only becoming more popular. For your business to prosper in this online world, mastering SEO is a necessity. To achieve optimal visibility, you need to understand the three most important concepts: crawlability, indexability, and mobile-first indexing. Why all of them? These three concepts are interdependent, and they form the most crucial pillars determining how search engines discover, understand, and rank your website. Your goal is to make your website stand out in an ever more competitive online space; your tool is SEO. In this blog, we focus on the what and why of all three concepts and the best practices that help your website rank higher in Google search results.

Introduction to Crawlability, Indexability, and Mobile-First Indexing

The three concepts are closely connected, so it is easy to confuse them without proper analysis. Crawlability refers to how easily search engines like Google can discover web pages; Google discovers pages by crawling them with computer programs called web crawlers or bots. Indexing is the process that follows crawling: indexability refers to whether search engines can add pages to their index, and Google indexes by analyzing pages and their content and adding them to its database of countless pages. Mobile-first indexing refers to Google's crawling process prioritizing the mobile version of a site's content over the desktop version to inform rankings.

Why These Concepts Matter in Today's SEO Landscape

With the continuous evolution of Google and other search engines, your SEO strategies need to be top-notch. With Google prioritizing mobile traffic through its mobile-first initiative, optimizing your website for both crawlability and indexability is the need of the hour. Without proper measures in these areas, your website may be heading towards stunted organic growth. And you don't wish for that to happen, do you? So read on till the end.

What is Crawlability in SEO?

Crawlability is the ability of search engines, through their crawler bots, to find and index web content. Google's crawl bots are computer programs that analyze websites and collect information about them; this information helps Google index websites and show relevant search results. The bots follow a list of web page URLs based on previous crawls and sitemaps, and as they visit a page they detect its links and add them to the list of pages to crawl. New pages then become part of search results, changes to existing pages are picked up, and dead links are noted so the Google index can be updated. It is therefore important to know how to get Google to crawl your site: if your website is not crawlable, chances are some of your web pages won't be indexed, which hurts your website's visibility and organic traffic and, in turn, your site's SEO.

Key Factors Affecting Crawlability

While crawlability is crucial for your site's visibility, several factors affect it. Some of the most important are below:

Site structure – Your site structure plays a pivotal role in crawlability. If a page isn't linked from anywhere, crawl bots cannot easily reach it.

Internal links – Web crawlers follow links while crawling your site and only find pages that are linked from other content. To get Google to crawl your site, maintain an internal linking structure across your website so crawl bots can easily find every page in your site structure.

Robots.txt and Meta Tags – These two are crucial for making the most of your crawl budget. Avoid directives that prevent crawlers from reaching pages you care about, and check your meta tags and robots.txt file for common errors (a minimal robots.txt sketch follows this list):
      ◦ <meta name="robots" content="noindex"/>: this robots meta tag blocks the page from being indexed.
      ◦ <meta name="robots" content="nofollow">: the page can still be crawled, but crawlers are told not to follow any links on it.
      ◦ User-agent: * Disallow: /: this robots.txt rule blocks crawlers from all your pages.

Crawl Budget – Google offers limited recommendations for crawl budget optimization, since its crawl bots usually finish the job without reaching their limit. But B2B and e-commerce sites with several hundred landing pages can be at risk of running out of crawl budget, so some of their pages may never make it into Google's index.
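
As a rough illustration only (the paths and the example.com domain are placeholders, not a recommendation for any particular site), a robots.txt that keeps crawlers out of non-public sections while advertising the sitemap could look like this:

  User-agent: *
  Disallow: /admin/
  Disallow: /user/login
  Sitemap: https://www.example.com/sitemap.xml

Everything not explicitly disallowed remains crawlable, and the Sitemap line helps crawlers discover the sitemap covered in the next section.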

Tools to Check and Improve Crawlability

There are various tools in the digital realm that claim to improve crawlability. Let me highlight a few of the most popular ones:

Google Search Console
What better tool to analyze crawlability on Google than Google's in-house tool? There is no fixed crawl schedule (Google typically recrawls a site somewhere between every 3 days and every 4 weeks), but Search Console shows you how to find and fix crawl errors and lets you analyze page traffic, keyword data, traffic broken down by device, and page rankings in search results. It also provides insights on how to rank higher on Google following Google's algorithms, and you can submit sitemaps and get comprehensive crawl, index, and related information directly from the Google index.

Screaming Frog SEO Spider
This is the tool to use when you want to take charge of two essential jobs at once: crawling your website and assisting with SEO audits. This web crawler helps increase the chances of your web content appearing in search engine results. It analyzes page titles and meta descriptions and discovers duplicate pages, all of which affect your site's SEO. The free version can crawl up to 500 pages on a website and can find whether any pages are redirected or whether your site has the dreaded 404 errors. If you need more, the paid version brings many features to the table, including JavaScript rendering, Google Analytics integration, custom source code search, and more.

XML Sitemaps
A roadmap for search engine crawlers, XML sitemaps ensure crawlers reach the important pages that need to be crawled and indexed. For better crawlability, meet the following requirements while creating sitemaps:

Choose Canonical URLs: Identify and include only your preferred (canonical) URLs in the sitemap, to avoid duplicates when the same content is accessible via multiple URLs.

CMS-generated Sitemaps: Most modern CMS platforms can generate sitemaps automatically. On Drupal, for example, the XML sitemap module creates sitemaps that adhere to the sitemaps.org guidelines; check your CMS documentation for specific instructions on sitemap creation.

Manual Sitemaps for Small Sites: For websites with fewer than a few dozen URLs, create a sitemap manually in a text editor, following the standard sitemap syntax (see the hand-written sketch at the end of this section).

Automatic Sitemap Generation for Large Sites: For larger websites, use tools or your website's database to generate sitemaps automatically, and split the sitemap into smaller files if necessary.

Submit Sitemap to Google: Submit the sitemap using Google Search Console or the Search Console API. You can also include the sitemap URL in your robots.txt file for easy discovery.

Bear in mind that submitting a sitemap doesn't ensure Google will crawl every URL. It serves as a hint to Google but does not guarantee crawling or indexing.
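
As a minimal hand-written sketch following the sitemaps.org protocol (the URLs and dates are placeholders), a sitemap for a very small site might look like this:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2024-09-01</lastmod>
    </url>
    <url>
      <loc>https://www.example.com/blog/</loc>
      <lastmod>2024-09-20</lastmod>
    </url>
  </urlset>

Only canonical URLs are listed; the file is typically saved at the site root (for example /sitemap.xml) and then submitted through Search Console or referenced from robots.txt.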

What is Google Indexing in SEO?

Indexing is the process of adding pages of a website to a search engine's database so they can be displayed in search results when relevant. Search engines like Google crawl websites and follow links on those sites to find new pages, then analyze their content to determine its quality and relevance. When search engines find quality, relevant content, they add those pages to their index. Indexing is one of the main processes of the Google search engine: it lets Google discover and analyze web content and display it to users as search results, which in turn drives SEO rankings.

Factors That Influence Indexability

Given how important indexability is for search engines to work properly, several factors can help or hinder it. Below is a list of the common factors that influence the indexability of website pages:

Content Quality and Relevance – Search engine bots favor high-quality content. Well-written, original, informative content with clear headings and an organized structure, relevant to users, is what makes it into a search engine's index.

Canonical Tags and Duplicate Content – Search engines do not want to index duplicate content, and duplicates cause confusion during Google page indexing. Canonical tags are the countermeasure: they help search engines distinguish the original from its duplicates so that only the relevant page is indexed (a sketch follows this list).

Technical Issues – Technical issues such as broken links, slow page load times, noindex tags or directives, and redirect loops can hurt SEO by preventing your site content from being crawled or indexed.
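
As a small illustration (the URL is a placeholder), a canonical tag is a single link element placed in the head of every duplicate variant of a page:

  <link rel="canonical" href="https://www.example.com/blog/seo-guide" />

Variants such as tracking-parameter URLs or print views then all point to the same preferred URL, telling search engines which version should be indexed.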

Using Google Search Console for Indexing Your Site

Various tools let you check whether your web pages are crawlable or indexed, but with Google being the dominant search engine, it is highly recommended to use Google's own indexing service to meet your SEO goals. Here is a brief guide to requesting indexing for your website through Google Search Console:

Verify Website Ownership: Confirm ownership by adding an HTML file or meta tag.

Submit Your Sitemap: Submit your sitemap URL in the Sitemaps section to help Google discover your significant pages faster.

Use the URL Inspection Tool: Submit specific pages for re-crawling or direct indexing requests.

Monitor Coverage Issues: Use the "Coverage" report to identify and fix indexing issues like blocked pages.

Improve Internal Linking: Ensure proper internal links for better crawling.

Enhance Content Quality: Avoid low-quality or duplicate content to improve indexing chances.

Fixing Common Google Indexing Issues

Here are some common Google indexing issues (overlapping with, and in addition to, the crawlability issues above) that you may come across on your website and that affect its visibility and performance in search results:

Low-Quality Content
Google ranks original and relevant content highest in user search results. If your web content is low quality, scraped, or stuffed with keywords, chances are Google won't index your web pages. Cater to the needs of your users and provide them with relevant, unique information that is aligned with the Webmaster Guidelines.

Not Mobile-Friendly
Google prioritizes mobile-friendliness while crawling web pages; if your website content isn't mobile-friendly, Google is likely not to index it. Implement an adaptable design, improve load times, compress images, get rid of pop-ups, and keep finger reach in mind while curating your website.

Missing Sitemap
Without a proper sitemap, you cannot reliably get Google to crawl your site. Use XML sitemaps rather than HTML sitemaps, as they improve your website's performance in search engines. Once you've created the sitemap, submit it through Google Search Console or reference it in your robots.txt file.

Poor Site Structure
Websites with poor navigability are often missed by crawl bots, and a poor site structure hurts both crawlability and indexability. Use a clear website structure and good internal linking for SEO purposes.

Redirect Loops
Redirect loops keep Google crawl bots from indexing your pages correctly: the bots get caught in the loop and cannot crawl your website. Check your .htaccess file and your HTML sources for unintentional or incorrect redirects, and use 301 redirects for permanent moves and 302 redirects for temporary ones to prevent this issue (a sketch follows below).
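
As a hedged sketch for Apache-based sites (the paths and domain are placeholders; other web servers use different syntax), permanent and temporary redirects can be declared in .htaccess like this, making sure the destination never redirects back to the source:

  # Permanent move: the old URL should pass its signals to the new one.
  Redirect 301 /old-page https://www.example.com/new-page
  # Temporary redirect: the original URL is expected to come back.
  Redirect 302 /spring-sale https://www.example.com/landing/spring-sale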

What is the difference between crawlability and indexability?

Since these two concepts are connected, people often mistake one for the other. Crawling is the process of search engines accessing and reading web pages; indexing is the process of adding pages to the Google index. Indexing comes after crawling, although in selective cases a page can be indexed by Google or another search engine without its content having been read.

What is Mobile-First Indexing?

As the name suggests, mobile-first indexing means Google crawl bots prioritize indexing the mobile version of website content over the desktop version, and this mobile-first index is meant to inform rankings. If your website content is not optimized for mobile devices, it can hamper your SEO rankings, even for desktop users. Mobile-friendly content gives your website an edge in being recognized for relevance and accessibility, improving visibility in search results. Mobile-first indexing is highly prioritized today to meet the expectations of the growing number of mobile users, and for a seamless mobile browsing experience it is crucial to have relevant, navigable content that reduces bounce rates and improves the overall SEO performance of your website.

The Impact of Mobile-First Indexing on SEO

Mobile-first indexing is a necessity today, and it directly affects the crawlability and indexability your SEO depends on. Optimize your website for mobile devices using an adaptable design: you can build specifically for mobile with a responsive web design and a mobile-friendly theme, or with a completely separate mobile website. Alternatively, you can develop app-based solutions or AMP pages to enhance your website's performance on mobile devices. Small businesses, e-commerce and B2B websites, and enterprise-level organizations are all strongly affected by the shift to mobile-first indexing, and it is high time for these organizations to build mobile-friendly websites that meet their customers' expectations. SEO professionals have also seen a surge in demand for mobile-responsive website design and development services. Aligning your SEO strategy with Google's algorithm is therefore key to standing alongside your competitors in this mobile-first landscape.

Are mobile-first indexing and mobile usability the same? Definitely not. A website may or may not be usable on mobile devices even though its content calls for mobile-first indexing.

Best Practices for Optimizing Crawlability and Indexability in a Mobile-First World

In a mobile-first world, it is important to optimize your web pages for greater visibility and better performance in search engine results. Below are a few best practices, supported by Google and other search engines, that can help improve the crawlability and indexability of your website with mobile-first indexing as the priority (a small responsive-design sketch follows this list):

Mobile-friendliness: An effective responsive design that serves the same HTML code on the same URL, irrespective of the device used, is key to mobile-friendliness. Your site should adapt to the screen size automatically; according to Google, responsive designs are the easiest to build and maintain.

Navigation: Mobile users often operate their devices with one hand, so design your website with fluid navigation. Implement larger buttons and menus, add ample spacing, and make interactive elements easy to reach for finger-friendly navigation. A pro tip: maintain a minimum tap target size of 48 x 48 px.

Content Consistency: Use the same meta robots tags for the desktop and mobile versions of your website, and ensure that your robots.txt file does not block access to the main content, especially on mobile devices, so that Google's crawl bots can still crawl and index your content once mobile-first indexing is enabled for your site.

Page Speed Optimization: Avoid lazy loading your primary content, since slow loading times hamper the experience for mobile users. Optimize your web images without compromising quality, minimize HTTP requests, and leverage browser caching. Use a CDN to ensure faster loading times for your content across the globe.
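
As a tiny sketch of the "same HTML on the same URL" approach (the class name is a placeholder), a responsive page declares a viewport and lets CSS adapt the layout to the screen width:

  <meta name="viewport" content="width=device-width, initial-scale=1">

  /* CSS: stack the sidebar below the main content on narrow screens */
  @media (max-width: 768px) {
    .sidebar { float: none; width: 100%; }
  }

The same markup is served to every device, so there is nothing separate for Google's mobile-first crawler to miss.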

Tools and Techniques for Ongoing Optimization

With mobile-first indexing in mind, it only makes sense to stay on top of current practice and keep checking your site for mobile-friendliness. Here are a couple of ways to do so:

Use Mobile-Friendly Test Tools with Google Search Console
Open your account in Google Search Console and click "Mobile-Friendly Test" in the Enhancements section. Enter the URL to be analyzed and click the "Test URL" button. Review the generated report: it tells you whether or not the page is mobile-friendly and offers suggestions for addressing any issues found. If the page is not mobile-friendly, the issues listed may include small text, appearance problems, content too wide for the screen, or other related usability problems. Once you've addressed these issues, re-test the page using the Mobile-Friendly Test tool, and run regular audits like this in Google Search Console to ensure the mobile-friendliness of pages across your site.

Optimize Core Web Vitals
Pay attention to page speed, interactivity, and visual stability to align with Google's Core Web Vitals. One way to do this is with Google's PageSpeed Insights tool, which examines web pages and provides data for both the mobile and desktop versions of your content, along with improvement suggestions. Simply enter the URL in the tool and click "Analyze" to generate a report that checks your website's performance, including a Core Web Vitals assessment. Higher scores mean better mobile responsiveness.

Common Pitfalls to Avoid in Mobile-First Indexing

Ignoring the Mobile Version of Your Site: Make sure all content, text and media included, is accessible not only on the desktop version of your site but also on mobile devices, to maintain content parity and SEO rankings.

Overlooking Technical SEO on Mobile: Mobile content also needs to be indexable for SEO purposes. Ensure that your mobile and desktop sites use the same structured data for consistency; doing so helps search engines properly analyze your web content.

Slow Mobile Page Speed: Compress images, leverage browser caching, and minimize code to speed up page loading on your mobile site. This way, you deliver a seamless user experience for mobile users.

Final thoughts

Crawlability and indexability are two significant factors that shape the SEO performance of your website. With Google prioritizing mobile-first indexing today, it's time to roll up your sleeves and build a mobile-focused SEO strategy that aligns with Google's algorithm and highlights your brand identity. With AI and machine learning increasingly shaping SEO, optimizing for mobile devices isn't just about adapting to algorithms; it's about meeting user expectations and staying competitive. A mobile-friendly site with fast loading times, responsive design, and consistent content will improve search rankings and user experience. As mobile usage grows, embracing mobile-first strategies keeps businesses ahead, enhancing both visibility and customer satisfaction in the evolving digital landscape.

Metadrop: Improving headers in a Drupal site using Dries' HTTP Header Analyzer

In a previous article I wrote about the importance of HTTP headers for web security while steering clear of the technical stuff. In this article I want to get into all the dirty technical details.

Some time ago we came across the HTTP Header Analyzer by Dries Buytaert. This tool, as the name suggests, analyses the headers of the HTTP response from a website. There are other header analysers out there, but this one, published by the creator of Drupal, also takes into account Drupal-specific headers. The tool displays a report of all headers found, along with an explanation of the purpose of the header and notes on the values of the header, sometimes including recommendations for better values. It also displays information about missing headers that should be present. 
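
As an illustration only (the exact set of headers and their recommended values depend on the site and on the analyzer itself; these values are placeholders), the security-related response headers such a tool typically looks at include:

  Strict-Transport-Security: max-age=31536000; includeSubDomains
  X-Content-Type-Options: nosniff
  X-Frame-Options: SAMEORIGIN
  Referrer-Policy: strict-origin-when-cross-origin
  Permissions-Policy: geolocation=(), camera=()
  Content-Security-Policy: frame-ancestors 'self'

On a Drupal site, headers like these can be added at the web server or CDN level, or through contributed modules such as Security Kit.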

With all this information, the tool gives a score based on the missing headers, warnings and notices detected during the analysis. On the first runs on our website, I have to admit that the score was not bad, but not good: 6/10. Now I am happy to say that we get a score of 10/10 on the home page. Unfortunately, not all pages can get the highest score, as I explain below.

The Drop Times: Drupal CMS Expected by 15 Jan, XB Further Away in 2025: A Quick and Dirty Summary of Driesnote

Get ready to witness a transformative era for Drupal! In his 40th State of Drupal address, founder Dries Buytaert announced groundbreaking plans that are set to redefine the platform. With the targeted launch of Drupal CMS 1.0 on January 15, 2025, coinciding with Drupal's 25th anniversary, the platform is gearing up to become the gold standard for no-code website building. From the ambitious Starshot Project aiming to revolutionize site-building for non-developers, to the introduction of the Experience Builder (XB) that brings React into the fold, Drupal is embracing innovation like never before. Plus, with a strong commitment to responsible AI integration and a complete overhaul of its documentation, Drupal is positioning itself at the forefront of web development. Dive into how these exciting developments will shape the future of Drupal and what they mean for the global community!