Can I use the 'Depth' parameter to limit the crawl to a subdirectory of a site?
I want it to crawl www.mysite.com/asubdirectory so that all links on pages in that directory are checked, especially for broken links. However, I don't want it to read pages in www.mysite.com/anothersubdirectory. At the moment it seems to limit itself to mysite.com but escapes into other subdirectories.
Thanks. However, doesn't this cause all pages without the specified prefix URL to be ignored? If I have a link to www.aseperatebrokensite.com, won't that link be ignored, so I will never be told it is broken?
Not quite. The directory restriction only applies to pages on that site; offsite links are still checked. In any case, gw won't follow offsite URLs unless you use -o. Perhaps you want:
gw -d- -D1 -o http://www.mysite.com/asubdirectory/
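The key distinction above is between *checking* a link (one request to see whether it is alive) and *crawling* a page (parsing it for further links). A restricted crawler checks every link it discovers, including offsite ones, but only recurses into pages under the chosen directory. Here is a minimal sketch of that logic in Python, using a hypothetical in-memory link graph in place of real HTTP fetches (the URLs and page contents are invented for illustration):

```python
# Hypothetical link graph standing in for real HTTP fetches.
# Each key is a page URL; each value is the list of links found on it.
PAGES = {
    "http://www.mysite.com/asubdirectory/": [
        "http://www.mysite.com/asubdirectory/page2.html",
        "http://www.mysite.com/anothersubdirectory/",
        "http://www.aseperatebrokensite.com/",
    ],
    "http://www.mysite.com/asubdirectory/page2.html": [],
}

# Only pages under this prefix are parsed for further links.
PREFIX = "http://www.mysite.com/asubdirectory/"


def crawl(start: str) -> list[str]:
    """Return every URL that would get a liveness check.

    Every discovered link is checked, but only pages whose URL
    begins with PREFIX are recursed into.
    """
    to_visit, seen, checked = [start], set(), []
    while to_visit:
        url = to_visit.pop()
        if url in seen:
            continue
        seen.add(url)
        checked.append(url)           # check every link, offsite or not
        if url.startswith(PREFIX):    # ...but only recurse inside the directory
            to_visit.extend(PAGES.get(url, []))
    return checked
```

Running `crawl(PREFIX)` checks the broken offsite link and the page in anothersubdirectory, but never parses either of them for more links, which matches the behaviour described above.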