Publish: restore the multiple retries (#1763)
- Previously, we tried to solve the "file locked" problem by making only one retry after a very long delay. This was based on an anecdote that the file is more likely to lock up when the delay is short.
- This didn't help at all in our case, and Andrey has made some locking-mechanism changes in the backend.
- The reduced number of retries probably increased the number of "failed to upload chunk" errors (not sure), which are supposedly a normal occurrence; we're expected to keep retrying. Restoring the retry behavior and monitoring...
This commit is contained in:
parent
96cdf11567
commit
0a88c6254d
1 changed file with 1 addition and 1 deletion
@@ -70,7 +70,7 @@ export function makeResumableUploadRequest(
   const uploader = new tus.Upload(file, {
     ...urlOptions,
     chunkSize: UPLOAD_CHUNK_SIZE_BYTE,
-    retryDelays: [122000],
+    retryDelays: [8000, 10000, 15000, 20000, 30000],
     parallelUploads: 1,
     storeFingerprintForResuming: false,
     urlStorage: new NoopUrlStorage(),
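The restored `retryDelays` array gives tus-js-client five increasing delays (in milliseconds), so a failed chunk upload is retried up to five times before the error is surfaced, instead of the single 122-second retry it replaces. As a minimal sketch of how such a delay schedule drives a retry loop, independent of tus itself (the `uploadWithRetries` function and the `attempt` callback below are hypothetical names for illustration, not part of the actual codebase):

```typescript
// Sketch: how a tus-style retryDelays schedule drives a retry loop.
// The delay values mirror the ones restored in this commit; the function
// and callback names are hypothetical.
const RETRY_DELAYS_MS = [8000, 10000, 15000, 20000, 30000];

async function uploadWithRetries(
  attempt: () => Promise<void>,       // one upload attempt (hypothetical)
  delays: number[] = RETRY_DELAYS_MS, // wait times between attempts
): Promise<void> {
  for (let i = 0; ; i++) {
    try {
      await attempt();
      return;                              // success: stop retrying
    } catch (err) {
      if (i >= delays.length) throw err;   // schedule exhausted: give up
      await new Promise((resolve) => setTimeout(resolve, delays[i]));
    }
  }
}
```

With five entries, a transient "failed to upload chunk" error has to recur six times in a row (one initial attempt plus five retries) before the upload fails for good, which matches the expectation in the commit message that such errors are normal and should simply be retried.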