New Participant
November 26, 2024
Solved

Optimizing EDS SEO score

  • November 26, 2024
  • 3 replies
  • 802 views

Hey guys,

I recently evaluated the Lighthouse SEO score for my published EDS (Edge Delivery Services) website and noticed that it was quite low. To better understand the issue, I also checked the score for the boilerplate site, and it appears to be low as well.

I’ve already ensured that metadata is included for each indexed page, but it seems that web crawlers might be blocked. Has anyone else encountered similar challenges with SEO for EDS sites? If so, I’d appreciate any insights or suggestions on how to address this and improve the SEO performance.
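(For context, page metadata in EDS is typically supplied through a metadata block at the end of the authored document, or via a bulk metadata sheet; a minimal sketch, with hypothetical values:)

```
| Metadata    |                                          |
|-------------|------------------------------------------|
| Title       | Example Page Title                       |
| Description | A short description shown in search.     |
```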

Thanks in advance for your help!



3 replies

New Participant
March 23, 2025

Your website is currently marked with "noindex" or "nofollow" robots directives. A "noindex" directive prevents Google and other search engines from including the page in search results (SERPs), while "nofollow" tells crawlers not to follow its links.

This often happens during the development phase of a new website. Make sure to remove these directives before launching the final version so the site can be properly indexed.
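(Such a directive typically appears in the page's `<head>` like the snippet below; on EDS it is usually generated from a "Robots" entry in the page or bulk metadata, so remove it there rather than in code. Names are standard; this is an illustration, not a line from the asker's site.)

```
<meta name="robots" content="noindex, nofollow">
```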

kautuk_sahni
Employee
December 16, 2024

@osmanwa Did you find the suggestion helpful? Please let us know if you require more information. Otherwise, please mark the answer as correct for posterity. If you've discovered a solution yourself, we would appreciate it if you could share it with the community. Thank you!

Kautuk Sahni
EstebanBustamante (Accepted solution)
New Participant
November 27, 2024

Hi, 

The issue is with the robots.txt file. You need to add your own robots.txt to override the out-of-the-box file, which blocks the site from being crawled. Please check this.

References: https://experienceleague.adobe.com/en/docs/experience-manager-cloud-service/content/edge-delivery/build/anatomy-of-a-franklin-project#tame-the-bots-robotstxt
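(To illustrate the difference: the out-of-the-box file behaves like a blanket `Disallow`, while the file you serve instead should allow crawling. A quick sanity check using Python's standard `urllib.robotparser`, with a hypothetical domain:)

```python
from urllib.robotparser import RobotFileParser

# A blocking robots.txt, similar in effect to the default one
# that prevents the site from being crawled.
blocking = RobotFileParser()
blocking.parse("User-agent: *\nDisallow: /".splitlines())
print(blocking.can_fetch("Googlebot", "https://www.example.com/"))  # → False

# A permissive robots.txt you could serve from your project instead.
permissive = RobotFileParser()
permissive.parse("User-agent: *\nAllow: /".splitlines())
print(permissive.can_fetch("Googlebot", "https://www.example.com/"))  # → True
```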

 

Hope this helps.

 

Esteban Bustamante