There are several ways to walk a site that uses cookies, and the Search Appliance should work with cookies that are set by the site on visited pages. If a login or similar cookie needs to be set, the preferred method is the Primer URL option, which specifies a URL the appliance visits before crawling any other pages so that the cookie gets set. It can include login parameters if needed, and it supports http-post URLs if a POST rather than a GET is required.
The primer is by far the preferred solution. But if that method just can't work for some reason, you would have to put the cookie file on a network share (or share the folder containing it), then mount that share on the appliance using Maintenance->Network filesystems and shares. The path to the cookie file would then be of the form /mnt/local/HOST/SHARE/PATH_TO_COOKIE_FILE . Make sure the share is always reachable and the file always present when a walk runs.
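If you go the cookie-file route, the sketch below shows one way to produce and round-trip such a file in the classic Netscape cookies.txt format, which is a common on-disk format for crawler cookie files, using only the Python standard library. The domain, cookie name, and value are placeholders, not details from this thread, and whether the appliance accepts exactly this format is an assumption worth verifying.

```python
import http.cookiejar
import time

# Hypothetical sketch: persist a long-lived login cookie in Netscape
# cookies.txt format so a crawler can read it from a mounted share.
# example.com / SESSIONID / abc123 are placeholders.

jar = http.cookiejar.MozillaCookieJar()

# In practice you would obtain the cookie by visiting the login URL with
# urllib.request and an HTTPCookieProcessor(jar); here the cookie is
# constructed by hand so the example is self-contained.
ten_years = int(time.time()) + 10 * 365 * 24 * 3600
cookie = http.cookiejar.Cookie(
    version=0, name="SESSIONID", value="abc123",
    port=None, port_specified=False,
    domain="example.com", domain_specified=True, domain_initial_dot=False,
    path="/", path_specified=True,
    secure=False, expires=ten_years, discard=False,
    comment=None, comment_url=None, rest={},
)
jar.set_cookie(cookie)

# Write the file the appliance would read from the share.
jar.save("cookies.txt", ignore_discard=True, ignore_expires=True)

# Sanity check: reload the file and confirm the cookie survived.
check = http.cookiejar.MozillaCookieJar()
check.load("cookies.txt")
print([c.name for c in check])  # → ['SESSIONID']
```

Saving with `ignore_expires=True` keeps the cookie in the file regardless of its expiry, which matters here since the whole point is a cookie that outlives individual walks.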
The primer URL does not work because, when I connect to the site without the cookie, the site sends an e-mail containing a URL that will generate the cookie. Unfortunately, that URL expires after 48 hours.
So do I fill in the primer URL with the cookie URL, do a 'new' walk, then clear out the primer URL and do 'refresh' walks after that?
I don't know what you mean by the cookie being "refreshed", but it does not work.
Normally the generated cookie has a lifetime of 10 years, but I need a way to get the appliance to remember it. The link that sets up the cookie unfortunately expires after 48 hours.
I'll have a try at the second method, although it does seem a bit insecure to put a cookie on a network share.