Fix missing claims in large collection (#726)

## Issue
A large collection such as http://localhost:9090/$/list/d91815c1bc8c80a1f354284a8c8e92d84d5f07a6 (193 items) often renders fewer items than it actually contains.

## Cause
We passed the same full list of IDs into `claim_search` and simply incremented `page` between calls. However, `claim_search` does not appear to preserve a stable ordering across calls, so some results from a previous page reappear in place of results that belong to the next page. The total count is unchanged, but the combined results now contain duplicates (and, correspondingly, missing items).
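To see why unstable ordering breaks naive paging, here is a minimal sketch of the failure mode. `pageNaively` and `orderFor` are hypothetical names, not part of the real code; `orderFor` simulates a server that may return the same ID set in a different order on each call.

```javascript
// Naive paging over a fixed ID list: each "page" call re-fetches the
// whole ordered set and takes the slice for that page. If the server's
// ordering differs between calls, items can be duplicated or dropped.
function pageNaively(ids, pageSize, orderFor) {
  const seen = [];
  for (let page = 1; page <= Math.ceil(ids.length / pageSize); page++) {
    const ordered = orderFor(page); // server's ordering may differ per call
    seen.push(...ordered.slice((page - 1) * pageSize, page * pageSize));
  }
  return seen;
}
```

With a stable ordering every item appears exactly once; with an ordering that shuffles between calls, an item that moved into an earlier page's slice shows up twice while another item vanishes.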

## Fix
When batching, slice the ID list ourselves and always request `page: 1`, so each `claim_search` call only ever sees its own batch of IDs. This sidesteps the ordering problem for now.
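The fix can be sketched as a standalone function. `buildBatches` and `claimSearch` are hypothetical names for illustration; `claimSearch` stands in for `Lbry.claim_search`.

```javascript
// Client-side batching: each call receives only its own slice of the
// claim ID list, so server-side ordering between calls cannot move a
// result from one batch into another.
function buildBatches(claimIds, batchSize, claimSearch) {
  const batches = [];
  for (let i = 0; i < Math.ceil(claimIds.length / batchSize); i++) {
    batches[i] = claimSearch({
      claim_ids: claimIds.slice(i * batchSize, (i + 1) * batchSize),
      page: 1, // always page 1: the slice already is the "page"
      page_size: batchSize,
      no_totals: true,
    });
  }
  return batches;
}
```

Because every call is page 1 of a disjoint ID subset, the union of the batches is exactly the original list, regardless of how the server orders results within a call.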
commit e4658bb044 (parent e890e2f4f8)
Author: infinite-persistence
Date: 2022-01-18 06:33:01 -08:00 (committed by GitHub)


```diff
@@ -160,8 +160,8 @@ export const doFetchItemsInCollections = (
   for (let i = 0; i < Math.ceil(totalItems / batchSize); i++) {
     batches[i] = Lbry.claim_search({
-      claim_ids: claim.value.claims,
-      page: i + 1,
+      claim_ids: claim.value.claims.slice(i * batchSize, (i + 1) * batchSize),
+      page: 1,
       page_size: batchSize,
       no_totals: true,
     });
```