Fetch Parallel URLs

gaurav.shetti
Posts: 119
Joined: Fri Feb 27, 2009 9:09 am

Fetch Parallel URLs

Post by gaurav.shetti »

Hi,

I just wanted to know how many parallel URLs can be fetched using the command:
<fetch Parallel...>

Is there a limit associated with it?

I am trying to fetch URLs that return XML data, and I would like to fetch up to 80 URLs in parallel.

Can you also explain what exactly would happen if I tried retrieving that many URLs at one go? Will it affect performance?

Regards,
Jitish
Kai
Site Admin
Posts: 1270
Joined: Tue Apr 25, 2000 1:27 pm

Fetch Parallel URLs

Post by Kai »

There is no hard limit, but do not set the PARALLEL value too high. It defaults to 10, but 2 or 3 would be recommended instead if the list of URLs is largely across one or two sites. PARALLEL is the number of *simultaneous* connections established, and that is the biggest driver of resource usage (network bandwidth and client memory).

If there are more than PARALLEL URLs in the list, the remaining ones are queued up until the first one(s) complete. There is also some client memory used for this waiting queue, but not nearly as much as for the active in-use PARALLEL connections.
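To make the queuing behavior concrete, here is a small sketch in Python (not Vortex syntax) of the same pattern: a fixed pool of PARALLEL workers, with any extra URLs waiting in a queue until a worker frees up. The URLs and the `fetch` body are placeholders that just record concurrency rather than doing real HTTP.

```python
# Sketch of PARALLEL-style fetching: at most PARALLEL "connections" run at
# once; the rest of the URL list waits in the pool's internal queue.
from concurrent.futures import ThreadPoolExecutor
import threading
import time

PARALLEL = 3  # analogous to the PARALLEL option; default in Vortex is 10
urls = [f"http://example.com/page{i}.xml" for i in range(10)]  # placeholders

active = 0  # connections currently "open"
peak = 0    # highest concurrency observed
lock = threading.Lock()

def fetch(url):
    """Stand-in for a real HTTP fetch; records how many run at once."""
    global active, peak
    with lock:
        active += 1
        peak = max(peak, active)
    time.sleep(0.05)  # simulate network latency
    with lock:
        active -= 1
    return url

with ThreadPoolExecutor(max_workers=PARALLEL) as pool:
    results = list(pool.map(fetch, urls))

print(peak)  # never exceeds PARALLEL
```

Note that the 10 URLs all complete, but no more than 3 are ever in flight at a time; the other 7 cost only a queue slot, which mirrors why the waiting queue is much cheaper than the active connections.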

The other resource usage consideration is for the target servers. A large PARALLEL value can cause many simultaneous connections to the *same* server (depending on your list ordering). This may consume more resources on that server and slow it down, even if the client (doing the <fetch>) might be able to handle more PARALLEL pages.
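Since the list ordering matters, one mitigation is to interleave the URL list by host before fetching, so consecutive slots in the PARALLEL window tend to hit different servers. A sketch (hostnames are made up):

```python
# Sketch: round-robin a URL list across hosts, preserving per-host order,
# so a large PARALLEL value is less likely to open many simultaneous
# connections to the *same* server.
from urllib.parse import urlparse
from itertools import zip_longest

def interleave_by_host(urls):
    """Group URLs by hostname, then take one from each group in turn."""
    by_host = {}
    for u in urls:
        by_host.setdefault(urlparse(u).hostname, []).append(u)
    out = []
    for batch in zip_longest(*by_host.values()):
        out.extend(u for u in batch if u is not None)
    return out

urls = [
    "http://a.example/1", "http://a.example/2", "http://a.example/3",
    "http://b.example/1", "http://b.example/2",
]
print(interleave_by_host(urls))
# → ['http://a.example/1', 'http://b.example/1',
#    'http://a.example/2', 'http://b.example/2', 'http://a.example/3']
```

With only two hosts this still alternates between them, so spreading the load fully also depends on having enough distinct sites in the list, which is why a small PARALLEL value is the safer choice when most URLs share one or two servers.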