tus: reduce chunk size (25MB -> 10MB)

With the throughput tweaks on the backend, the number of "file is locked" errors appears to have decreased.

The next thing to try is reducing the chunk size, in the hope that each file write completes faster, shortening the lock duration enough to avoid triggering a timeout.
infinite-persistence 2022-01-15 14:44:15 +08:00
parent 0b5f10c508
commit d14470830c


@@ -7,7 +7,7 @@ import { LBRY_WEB_PUBLISH_API_V2 } from 'config';
 const RESUMABLE_ENDPOINT = LBRY_WEB_PUBLISH_API_V2;
 const RESUMABLE_ENDPOINT_METHOD = 'publish';
-const UPLOAD_CHUNK_SIZE_BYTE = 25 * 1024 * 1024;
+const UPLOAD_CHUNK_SIZE_BYTE = 10 * 1024 * 1024;
 const STATUS_CONFLICT = 409;
 const STATUS_LOCKED = 423;