I updated robots.txt today (and confirmed the update by loading robots.txt in the browser), then ran a new Webinator walk, but files in the excluded directories are still showing up in the index.
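For context, the entries I added are along these lines (the directory name here is just the placeholder from my example below, not the real path):

    User-agent: *
    Disallow: /excludeddir/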
Note: the files that link to the 'excluded' files are not themselves in the excluded directories (e.g., /indexeddir/file.asp links to /excludeddir/dontindexme.pdf). Is that the problem? Do I also need to add the excluded directories to the Exclusions list?