Google Has Made Four Changes to Its Index Coverage Report
Google Search Console's Index Coverage report is getting four updates aimed at keeping site owners better informed about indexing issues. The Index Coverage report was first introduced in 2018. The changes now rolling out are based on feedback submitted by the webmaster community:
"Based on the feedback we got from the community, today we are rolling out significant improvements to this report so you're better informed on issues that might prevent Google from crawling and indexing your pages. The change is focused on providing a more accurate state to existing issues, which should help you solve them more easily." |
- Removal of the generic "crawl anomaly" issue type: all crawl errors should now be mapped to an issue with a finer resolution.
- Pages that were submitted but blocked by robots.txt and got indexed are now reported as "indexed but blocked" (a warning) instead of "submitted but blocked" (an error); see the sketch after this list.
- The addition of a new issue type: "indexed without content".
- Soft 404 reporting is now more accurate.
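To see how the "indexed but blocked" case arises, keep in mind that robots.txt controls crawling, not indexing: a disallowed URL can still end up in the index if Google discovers links pointing to it. Below is a minimal sketch, using Python's standard library, for checking whether a URL is disallowed for Googlebot; the site and page URLs are hypothetical, used only for illustration.

```python
# Check whether a URL is disallowed for Googlebot by robots.txt.
# Note: a disallowed page can still be indexed (hence the "indexed
# but blocked" warning) if Google finds it through links elsewhere.
from urllib.robotparser import RobotFileParser

# Hypothetical site and page used for illustration.
parser = RobotFileParser("https://example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt file

url = "https://example.com/private/report.html"
if parser.can_fetch("Googlebot", url):
    print(f"Googlebot may crawl {url}")
else:
    print(f"{url} is blocked for Googlebot; if the goal is to keep it "
          "out of the index, a noindex directive is the right tool, "
          "since robots.txt only prevents crawling")
```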
The primary theme of the updates is data accuracy: site owners get a clearer picture of problems such as crawl errors and pages blocked by robots.txt, soft 404 reporting becomes more accurate, and there is a new "indexed without content" issue type. The Search Console Help page has this to say about "indexed without content":
"This page appears in the Google index, but for some reason, Google could not read the content. Possible reasons are that the page might be cloaked to Google or the page might be in a format that Google can't index. This is not a case of robots.txt blocking." |