Intranet vs Internet access

Thunderstone
Site Admin
Posts: 2504
Joined: Wed Jun 07, 2000 6:20 pm

Intranet vs Internet access

Post by Thunderstone »



If we implement Webinator on our site we will need to restrict some
pages to our Intranet while most are available over the Internet.
How have other people solved this problem?

I have listed my ideas so far below; can anyone also tell me whether
number 3 would work? It might be practical, but I'm not sure yet how
many restricted files I'm working with.

1. Have 2 indexes.
Pro: simple to do.
Con: 2 indexes and 2 rewalks will use a lot of time and disk space;
we are a pretty big site (~1GB of HTML files).

2. Disable the "view matched info" option, so details aren't visible
and the actual page is restricted to the Intranet.
Pro: easy to do; 1 index, 1 rewalk.
Con: no match info for unrestricted files, and the existence of
restricted pages is still revealed.

3. Possibly use a kind of meta robots tag?
Restricted files would include a meta tag with its contents set to
"ExcludeExternal", for example; a separate search page for external
visitors would then exclude any result that contains this tag in the
meta field of the html table.
Pro: 1 index, 1 rewalk.
Con: all restricted pages have to be modified to include the extra tag.
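Option 3 above can be sketched in a few lines. This is only an
illustration of the filtering idea, not Webinator's actual API: the
result format, the "meta" field name, and the filter_external helper
are all assumptions for the sketch.

```python
# Sketch of option 3: each restricted page carries a marker in its
# meta keywords, e.g.  <meta name="keywords" content="ExcludeExternal">
# and the external search page drops any result whose meta field
# contains that marker.  (Result format and field names are assumed.)

def filter_external(results):
    """Keep only results that are safe to show to Internet visitors."""
    return [r for r in results if "ExcludeExternal" not in r.get("meta", "")]

results = [
    {"url": "/public/welcome.html", "meta": "news, welcome"},
    {"url": "/internal/payroll.html", "meta": "ExcludeExternal, payroll"},
]
print(filter_external(results))
```

The internal search page would simply skip the filter, so both
audiences share the same index and the same rewalk.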




Dr Chris Barran
Special Projects
Corporate Information & Computing Services
University of Sheffield




Intranet vs Internet access

Post by Thunderstone »



With our site, we have two main directories: public and internal. I check
the contents of a specific environment variable and restrict the search as
appropriate with a _matches_ clause.
This is also how we take care of specific collections such as Tech Support.
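The decision step described above can be sketched as follows. The
REMOTE_ADDR environment variable is standard CGI; the internal address
prefix and the returned path restriction are made-up values for
illustration only, and the actual restriction in Webinator would be
applied via its own matches clause rather than this helper.

```python
import os

# Assumed internal address range, illustration only.
INTERNAL_PREFIX = "10."

def search_restriction(remote_addr):
    """Choose a path restriction based on where the visitor connects from."""
    if remote_addr.startswith(INTERNAL_PREFIX):
        return None          # internal visitor: search both directories
    return "/public/"        # external visitor: public directory only

# In a CGI context the address comes from the environment:
addr = os.environ.get("REMOTE_ADDR", "")
print(search_restriction(addr))
```

One index and one rewalk serve both audiences; only the restriction
applied at query time differs.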

Tim Rosine
tim.rosine@infores.com

-----Original Message-----
From: webinator@thunderstone.com
Sent: Monday, March 01, 1999 11:17 AM
To: Tim Rosine
Subject: Intranet vs Internet access




