When I spider with gw or dowalk, is it intelligent enough to follow pages that are navigated to via JavaScript? For example, one of the pages has a select option list that calls a JS function to do a form submit whenever an option is selected. Will the spider catch these pages?
No; nor will anyone using a browser with JavaScript disabled. All links must be in static HTML for Webinator to find them, perhaps in a <NOSCRIPT> block so non-script browsers can follow them too.
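For example, a minimal sketch of the NOSCRIPT approach, assuming the select's form submit lands on pages like /reports/2001.html (hypothetical URLs; substitute whatever pages your form actually reaches):

    <noscript>
      <!-- Static links the spider (and script-less browsers) can follow -->
      <a href="/reports/2001.html">2001 report</a>
      <a href="/reports/2002.html">2002 report</a>
    </noscript>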
Commented-out sections don't exist as far as text and URL extraction are concerned. Empty-text links will work, though.
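To illustrate the difference with the same hypothetical URL: a commented-out link is invisible to the spider, while an empty-text link is still extracted even though the browser renders no visible text:

    <!-- <a href="/reports/2001.html">commented out: the spider never sees this</a> -->
    <a href="/reports/2001.html"></a>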
Are you really that averse to letting users without JavaScript use your site? When I work over a modem I have all of that stuff (Java, JavaScript, Flash, etc.) turned off.
I hear ya, but I'm just a small (contracted) cog in a big machine. Thanks for your help; I really appreciate it. I think we'll go with the empty-text links.