Hello,
I'm trying to use do_walk so I can use the <DEL></DEL> tags to comment out my header, which is set up as an include on my pages. I am entering the following command:
/usr/local/apache/cgi-bin/texis top=http://172.18.21.40/retweb/index.shtml do_walk/dispatch.txt
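In case the setup matters, here's roughly what one of my pages looks like — the header comes in via an SSI include, and I've wrapped that include in the DEL tags so the walker skips it when indexing. (This is a simplified sketch, not the real file; the header path is made up.)

```html
<!-- index.shtml (simplified sketch; header path is hypothetical) -->
<html>
<body>
<DEL>
  <!--#include virtual="/retweb/header.shtml" -->
</DEL>
<p>Actual page content that I do want indexed.</p>
</body>
</html>
```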
It starts on the index.shtml page and then just finishes with that. Nothing is indexed, and I do have the .shtml extension being recognized in the script. It says:
Started 1 (40050) on http://172.18.21.40/retweb/index.shtml
Finished 40050 on http://172.18.21.40/retweb/index.shtml
Updating Metamorph index
When I look at the log I get (this is just a few of the first lines):
http://172.18.21.40/robots.txt
100 do_walk(getrobotstxt) 590: Document not found: http://172.18.21.40/robots.txt returned code 404 (Not Found)
http://172.18.21.40/retweb/index.shtml
100 do_walk(procpage) 415: More Values Than Fields in the function Insert while processing url http://172.18.21.40/retweb/index.shtml
http://172.18.21.40/retoffice/applicati ... owto.shtml
It seems to be crawling the links, because it visits the PDFs and all the .shtml files; it just doesn't index them. And the "More Values Than Fields in the function Insert while processing url" error seems to show up on every page.
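If I understand the message right, "More Values Than Fields" is the generic SQL complaint you get when an INSERT supplies more values than the target table has columns — here's a plain-SQLite illustration of that same class of failure (nothing Texis-specific; the table and column names are just made up for the demo):

```python
import sqlite3

# Generic illustration (SQLite, not Texis): an INSERT that supplies
# more values than the table has columns fails with a similar error.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE html (Url TEXT, Body TEXT)")  # hypothetical 2-column table
try:
    # Three values into a two-column table -> "more values than fields"
    conn.execute("INSERT INTO html VALUES (?, ?, ?)",
                 ("http://example/", "body text", "extra value"))
except sqlite3.OperationalError as e:
    print("insert failed:", e)
```

So my guess is the walk script's table and its Insert statement have gotten out of sync somehow.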
I saw in one posting you suggested someone look at what URLs were in the database. I did that and got:
115 Field url non-existent
115 Field non-existent or type error in url
000 SQL Prepare() failed with -1
Is it something with the .shtml?
Thanx,
Justin