I've got two large databases that I need to retrieve data from based on user queries. Once I get the data back, I need to order it by relevance or date and present it to the user as though it came from a single source. To make matters worse, the user will be able to specify how many results are shown per page and page through them X at a time.
My first thought was to retrieve all the data from each database and then sort the combined set, but this raises an interesting question: are rankings from a LIKEP query only valid within a single result set? In other words, if one database returns a match with a rank of 780, does a rank of 790 from the other database imply that the second match is more relevant?
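For what it's worth, if the scores *were* directly comparable across databases, the "retrieve everything and sort together" approach is just a merge of two already-ranked lists plus slicing for the page. Here is a minimal sketch of that, assuming each database hands back `(score, id, source)` tuples sorted descending by score; the data and field layout are illustrative, not from any real schema:

```python
from heapq import merge

# Hypothetical result rows from each source, already sorted by score descending.
results_a = [(780, "a1", "db_a"), (510, "a2", "db_a")]
results_b = [(790, "b1", "db_b"), (300, "b2", "db_b")]

def merged_page(results_a, results_b, page, per_page):
    """Merge two descending-ranked result sets and return one page.

    This assumes scores are comparable across databases -- exactly the
    assumption in question -- so treat it as a sketch, not a recommendation.
    """
    combined = list(merge(results_a, results_b,
                          key=lambda row: row[0], reverse=True))
    start = page * per_page
    return combined[start:start + per_page]
```

If the scores turn out not to be comparable, the merge step is where you'd substitute a normalization or re-scoring pass before slicing out the page.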
Is anyone out there doing this sort of thing? I'm wondering whether it would be more efficient to distill the content into a third database that holds pointers back to the source databases. Then I could search just the aggregate database and only hit a source database when someone wanted to view a match. Of course, this would lose some accuracy, which is much prized for this application.
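To make the aggregate-database idea concrete, here is a small sketch using an in-memory SQLite table. The table name, columns, and data are all hypothetical; the point is just the shape: distilled searchable text plus a (source database, source key) pointer, so the full record is only fetched from the source when a user opens a match:

```python
import sqlite3

# Aggregate index: distilled content plus a pointer back to the source record.
# All names here are illustrative, not a real schema.
agg = sqlite3.connect(":memory:")
agg.execute("""
    CREATE TABLE aggregate (
        summary   TEXT,   -- distilled content to search against
        source_db TEXT,   -- which source database holds the full record
        source_id TEXT    -- key of the record in that source database
    )
""")
agg.execute(
    "INSERT INTO aggregate VALUES ('quarterly sales report', 'db_a', '42')")

def find_pointers(term):
    """Search only the aggregate; the source db is consulted later, on view."""
    cur = agg.execute(
        "SELECT source_db, source_id FROM aggregate WHERE summary LIKE ?",
        (f"%{term}%",))
    return cur.fetchall()
```

The trade-off you mention shows up in the `summary` column: whatever ranking you compute over the distilled text won't be as accurate as ranking against the full source content.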
Any suggestions?