Google on the "Noindex Detected in X-Robots-Tag HTTP Header" Error: Explanation and Solution


The story begins with a Reddit user describing a situation that many web developers face. Google Search Console reports that it cannot index a page because indexing is blocked, which is different from crawling being blocked. However, an inspection of the page turned up no noindex directives, and no crawl block was found in robots.txt.
GSC shows "Noindex Deteted in X-RBOTS-TAG HTTP Header" for large part of my URL. However: I can not find any NOINDEX in HTML-output, no NOINDEX in Robots.txt, no Noindex visible in testing headings during testing, Live test in GSC shows page as indexed, the site is for Cloudflare
One of the Reddit users suggested checking whether the problem was related to Cloudflare. He provided detailed instructions on how to work out whether Cloudflare, or something else, was keeping Google from indexing the page.
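As a rough illustration of that kind of check (not taken from the original thread), here is a minimal Python sketch that fetches a page twice, once with a browser-like User-Agent and once as Googlebot, and compares the X-Robots-Tag response header. The URL is a hypothetical placeholder; substitute one of the flagged pages. If the noindex shows up only for the Googlebot request, a bot-specific rule somewhere in the stack (for example, in the CDN) is the likely culprit.

```python
import urllib.request
from urllib.error import HTTPError

URL = "https://example.com/affected-page"  # hypothetical placeholder URL

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": (
        "Mozilla/5.0 (compatible; Googlebot/2.1; "
        "+http://www.google.com/bot.html)"
    ),
}

for name, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        resp = urllib.request.urlopen(req, timeout=10)
    except HTTPError as err:
        resp = err  # an HTTPError still carries the status and headers
    tag = resp.headers.get("X-Robots-Tag", "(not present)")
    print(f"{name}: HTTP {resp.getcode()}, X-Robots-Tag: {tag}")
```

One caveat with any such test: if the header is injected conditionally (by bot detection, firewall rules, or a stale cache), a single request may not reproduce what Googlebot saw at crawl time.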
John Mueller from Google also shared his thoughts on the problem. According to him, he has seen this error come up in connection with CDNs (content delivery networks). Interestingly, he has also seen it happen with very old URLs. He did not elaborate on the latter, but it seems to hint at an indexing bug involving old indexed URLs.
- 📌 It is important to understand that the "Noindex detected in X-Robots-Tag HTTP header" error can be caused by various factors, including a CDN and even very old URLs.
- 📌 You should always thoroughly check the page source, the robots.txt file, and the HTTP response headers.
- 📌 When hunting for the problem, tools that fetch the page as Googlebot, such as Google's Rich Results Test, can help; a bulk header check like the sketch below can also speed things up.
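Since the error typically affects "a large part" of a site's URLs rather than a single page, it can help to audit them in bulk. The following is a minimal sketch, assuming you have a list of affected URLs (for example, exported from GSC): it fetches each one as Googlebot and reports any X-Robots-Tag header currently being served. The sample URLs are hypothetical.

```python
import urllib.request
from urllib.error import HTTPError, URLError

GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

# Hypothetical sample; in practice, load the affected URLs from a GSC export.
urls = [
    "https://example.com/page-1",
    "https://example.com/page-2",
]

for url in urls:
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        resp = urllib.request.urlopen(req, timeout=10)
        tag = resp.headers.get("X-Robots-Tag", "(none)")
    except HTTPError as err:
        # Error responses can still carry the header, so report it too.
        tag = err.headers.get("X-Robots-Tag", "(none)")
    except URLError as err:
        tag = f"request failed: {err.reason}"
    print(f"{url}\t{tag}")
```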
This article was generated with the help of AI from the material cited below, then edited and manually checked by the author for accuracy and usefulness.
https://www.searchenginejournal.com/google-on-search-console-noindex-detected-errors/540829/