JavaScript product page problems

In this episode of SEO Fairy Tales, Martin Splitt and Jamie Indigo, a Senior Technical SEO at Lumar (formerly DeepCrawl), chat in depth about a JavaScript technical SEO problem: how 3 million product page listings went missing from Google’s index. Find out how Jamie solved her client’s problem step by step using Search Console’s URL Inspection tool, the robots.txt tester, and Chrome Developer Tools.
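The failure pattern at the heart of the episode is common to client-side rendered pages: the HTML shell loads fine, but the product content arrives via an API call made by JavaScript. A minimal sketch of that pattern (the endpoint and element names are hypothetical, not taken from the episode):

// Hypothetical client-side rendered product page. If the crawler
// cannot fetch the API (for example, because robots.txt disallows
// its path), the page renders empty and can be classified as a
// soft 404.
async function renderProductPage(productId) {
  const container = document.querySelector('#product');
  try {
    const response = await fetch(`/api/products/${productId}`);
    if (!response.ok) throw new Error(`API returned ${response.status}`);
    const product = await response.json();
    container.innerHTML = `<h1>${product.name}</h1><p>${product.description}</p>`;
  } catch (err) {
    container.textContent = ''; // blocked or failed call: nothing to index
  }
}
renderProductPage(new URLSearchParams(location.search).get('id'));

When that fetch fails for Googlebot, the rendered HTML contains no product content, which is how millions of otherwise valid URLs can drop out of the index at once.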

Chapters
0:00 – Introduction
1:23 – Product page problems
2:19 – Starting the investigation
3:04 – Searching for the soft 404 source
4:23 – Inspect URL clues
5:20 – CDN caching
6:31 – API calls breaking
7:25 – Block and load
8:46 – Fixing the issue
10:11 – Wrap up
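A robots.txt rule of the following shape would produce the failure these chapters trace, from soft 404 symptom to blocked API cause: the product pages themselves stay crawlable, but the API their JavaScript depends on does not. The paths are hypothetical, for illustration only:

# Hypothetical robots.txt. The Disallow also blocks the fetch()
# calls Googlebot's renderer makes, so product pages render empty.
User-agent: *
Disallow: /api/

The same broken state can be reproduced on a live page with Chrome DevTools: block the API URL with the Network panel’s request blocking feature and reload, which appears to be what the “Block and load” chapter refers to.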

Watch more episodes of SEO Fairy Tales → https://goo.gle/SEOFairyTales
Subscribe to Google Search Central Channel → https://goo.gle/SearchCentral

#SEO #JavaScript

Comments

  1. From what I can gather, improper use of robots.txt was blocking scripts used to populate the product pages, causing them to soft 404. CDN caching of the robots.txt file then made QA’ing the fix more difficult. Lesson: don’t use robots.txt to keep pages out of the index; it’s almost always the wrong tool for that, since blocked resources can break rendering. Use noindex instead.
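To make the commenter’s distinction concrete: robots.txt controls crawling, while noindex controls indexing, and a noindex directive only works if Googlebot is allowed to crawl the page and see it. A sketch, assuming a Node.js/Express server (the route and port are illustrative):

// Hypothetical Express handler: keep the URL crawlable and control
// indexing with an X-Robots-Tag header rather than a robots.txt
// Disallow, which would also hide the directive from Googlebot.
const express = require('express');
const app = express();
app.get('/internal-search', (req, res) => {
  res.set('X-Robots-Tag', 'noindex');
  res.send('<html><body>Internal search results</body></html>');
});
app.listen(3000);

The equivalent for a static page is <meta name="robots" content="noindex"> in the <head>.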
