The Problem
So, three files on my website aren't being indexed as they should be because, apparently, they are being blocked by my robots.txt file.
According to Googlebot, lines eleven, twelve, and thirteen are the ones causing the trouble:
However, this behavior is confusing because, according to the "Order of precedence for rules" section of How Google Interprets the robots.txt Specification, Google chooses the most specific rule and, when rules conflict, the least restrictive one.
To me, Allow: /resources/furries/faq.html is more specific (and less restrictive) than Disallow: /resources/furries/.
The Desired Effect
The intended behavior is that Disallow: /resources/furries/ blocks everything in that directory by default, while the files I explicitly list (faq.html, index.html, etc.) remain allowed and indexable.
Unfortunately, I can't seem to get this behavior.
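For what it's worth, the intended rules can be sanity-checked locally with Python's standard-library urllib.robotparser. One caveat: Python's parser applies rules first-match in file order, whereas Google uses longest-match, so the sketch below lists the Allow rules first; under Google's longest-match interpretation the more specific Allow should win regardless of order. The rules here are a reconstruction of what I'm aiming for, not my actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt expressing the desired behavior:
# block the directory, but explicitly allow the listed files.
rules = """\
User-agent: *
Allow: /resources/furries/faq.html
Allow: /resources/furries/index.html
Disallow: /resources/furries/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Explicitly allowed file: fetchable.
print(rp.can_fetch("*", "/resources/furries/faq.html"))    # True
# Any other file in the directory: blocked by the Disallow rule.
print(rp.can_fetch("*", "/resources/furries/secret.html"))  # False
```

If a crawler that follows the longest-match rule still reports the allowed files as blocked, the problem likely lies in the actual rule text (wildcards, trailing characters, or a different user-agent group) rather than in the precedence logic itself.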
Cheers!