## Issue
- Closes 592 Force clear stuck upload
- It was possible for an upload to stay "locked", e.g. when the browser is killed.
## Change
When refreshing or opening a new tab, always clear the locks. Ongoing sessions will re-lock them immediately.
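A minimal sketch of the idea, assuming the locks live in `localStorage` under a hypothetical `upload-lock:` prefix (the real key layout may differ):

```js
// Illustrative sketch: clear any stale per-tab upload locks on page load.
const UPLOAD_LOCK_PREFIX = 'upload-lock:';

function clearStaleUploadLocks() {
  Object.keys(window.localStorage)
    .filter((key) => key.startsWith(UPLOAD_LOCK_PREFIX))
    .forEach((key) => window.localStorage.removeItem(key));
}

// Run once when the app boots (refresh or new tab). Tabs with an active upload
// re-acquire their locks immediately, so only stuck locks stay cleared.
clearStaleUploadLocks();
```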
`GetLinksData` is somewhat expensive. The value won't change until the user changes the window size or selects another homepage.
As we can't call an `effect` within a `memo`, we extracted `isLargeScreen` as an input parameter, which is fine since it makes `GetLinksData` more functional (functional programming).
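Roughly the pattern, as a sketch (the `useIsLargeScreen` hook body and the row-building code are illustrative stand-ins, not the actual implementation):

```js
import React from 'react';

// Illustrative stand-in for the real screen-size hook.
function useIsLargeScreen() {
  const [isLarge, setIsLarge] = React.useState(window.innerWidth > 1600);
  React.useEffect(() => {
    const onResize = () => setIsLarge(window.innerWidth > 1600);
    window.addEventListener('resize', onResize);
    return () => window.removeEventListener('resize', onResize);
  }, []);
  return isLarge;
}

// Pure function of its inputs -- no hooks inside, so the caller can memoize it.
function GetLinksData(homepageData, isLargeScreen) {
  return homepageData.slice(0, isLargeScreen ? 12 : 6); // placeholder for the real row-building
}

function HomePage({ homepageData }) {
  const isLargeScreen = useIsLargeScreen();
  // Recomputed only when the screen-size bucket or the homepage data changes.
  const rows = React.useMemo(
    () => GetLinksData(homepageData, isLargeScreen),
    [homepageData, isLargeScreen]
  );
  return null; // render `rows`...
}
```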
* Upload: fix redux key clash
## Issue
`params` is the "final" value that will be passed to the SDK and `channel` is not a valid argument (it should be `channel_name`). Also, it seems like we only pass the channel ID now and skip the channel name entirely.
For the anonymous case, a clash can still happen, since the channel part is hardcoded to `anonymous`.
## Approach
Generate a guid in `params` and use that as the key to handle all the cases above. We couldn't use the `uploadUrl` because v1 doesn't have it.
The old formula is retained to allow users to retry or cancel their existing uploads one last time (otherwise it will persist forever). The next upload will be using the new key.
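A sketch of the keying change, assuming the `uuid` package and hypothetical state names:

```js
// Sketch only: key `currentUploads` by a per-upload guid stored in `params`,
// instead of a channel/name-derived key that can clash (e.g. anonymous uploads).
import { v4 as uuid } from 'uuid';

function makeUploadEntry(params) {
  const guid = uuid();
  return {
    key: guid, // currentUploads[guid] = ...
    params: { ...params, guid },
  };
}
```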
* Upload: add tab-locking
## Issue
- The previous code did detect uploads from multiple tabs, but it relied on handling the CONFLICT error message from the backend. In certain corner cases, this does not work well. A better way is to not allow resumption while the same file is being uploaded from another tab.
- When an upload from one tab finishes, the GUI on the other tab does not remove the completed item. The user has to either refresh or click Cancel, and clicking Cancel results in a 404 from the backend. This should be avoided.
## Approach
- Added tab synchronization and locking by passing the "locked" and "removed" information through `localStorage`.
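A minimal sketch of the cross-tab signalling (key names are hypothetical): writing to `localStorage` fires a `storage` event in every *other* tab, which is enough to propagate "locked" / "removed" state without a sync library.

```js
function lockUpload(guid) {
  window.localStorage.setItem(`upload-locked:${guid}`, Date.now().toString());
}

function notifyRemoved(guid) {
  window.localStorage.setItem(`upload-removed:${guid}`, Date.now().toString());
}

window.addEventListener('storage', (e) => {
  if (!e.key) return;
  if (e.key.startsWith('upload-locked:')) {
    // Another tab owns this upload; disable Resume locally.
  } else if (e.key.startsWith('upload-removed:')) {
    // Another tab finished or cancelled; drop the item from this tab's list.
  }
});
```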
## Other considered approaches
- Wallet sync -- but decided not to pollute the wallet.
- 3rd-party redux tab syncing -- but decided it's not worth adding another module for 1 usage.
* Upload: check if locked before confirming delete
## Reproduce
1. Have 2 tabs + a paused upload.
2. Open the "cancel" dialog in one of the tabs.
3. Continue the upload in the other tab.
4. Confirm the cancellation in the first tab.

The upload disappears from both tabs, but based on network traffic the upload keeps happening. (If the upload finishes, the claim seems to get created.)
* Fix query selection
* Fix xml format
* Fix link url and author_url
* Refactor repeated components
* Refactor repeated embed iframe string
* Add support for passing referrer queries to src
* Change iframe id from lbry to odysee
* Improve replace logic understanding
* Fix URL
Co-authored-by: Thomas Zarebczan <tzarebczan@users.noreply.github.com>
The recent change to parse the channel from a Repost using a `claim` ended up breaking the case for abandoned claims, where `claim` will be `null`.
Fix by looking at `claim` first (faster), and falling back to the `parseURI` method if it remains inconclusive.
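A sketch of the fallback (the import path and claim-field access are assumptions about the existing URI and claim helpers):

```js
import { parseURI } from 'lbry-redux'; // assumed import path for the existing helper

// Prefer the resolved claim (cheap); only parse the uri when the claim is null,
// e.g. for abandoned claims.
function getChannelNameForRepost(claim, uri) {
  if (claim && claim.signing_channel) {
    return claim.signing_channel.name;
  }
  try {
    const { channelName } = parseURI(uri);
    return channelName || null;
  } catch (e) {
    return null;
  }
}
```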
* Add remove_duplicates to tile/list claim_search except for Channel Page
This removes any duplicates from reposts.
* Re-activate the "Hide reposts" setting
* Category Rows: default to ['stream', 'repost'] unless specified otherwise.
* Exclude default homepage data at compile time
The youtuber IDs alone are pretty huge, and are unused in the `CUSTOM_HOMEPAGE=true` configuration.
* Remove Desktop items and other cleanup
- Moved constants out of the component.
- Removed the SIMPLE_SITE check.
- Removed Desktop-only items.
* Sidebar: limit subscription and tag section
## Issue
Too slow for huge lists
## Change
Limit to 10 initially, and load everything on "Show more"
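A sketch of the "limit then expand" pattern (the constant and component names are illustrative):

```js
import React from 'react';

const SIDEBAR_SUBS_LIMIT = 10;

function SubscriptionSection({ subscriptions }) {
  const [expanded, setExpanded] = React.useState(false);
  const visible = expanded ? subscriptions : subscriptions.slice(0, SIDEBAR_SUBS_LIMIT);

  return (
    <ul>
      {visible.map((sub) => (
        <li key={sub.uri}>{sub.channelName}</li>
      ))}
      {!expanded && subscriptions.length > SIDEBAR_SUBS_LIMIT && (
        <button onClick={() => setExpanded(true)}>Show more</button>
      )}
    </ul>
  );
}
```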
* Fix makeSelectThumbnailForUri
- Fix memo.
- Expose a function to extract the thumbnail directly from the claim if the client already has it.
* Publish button: use spinner instead of "Publishing..."
Looks better, plus the preview could take a while sometimes.
* Refactor `doPublish`. No functional change
This is to allow `doPublish` to accept a custom payload as an input (for resuming uploads), instead of always resolving it from the redux data.
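The rough shape of the refactor, as a sketch (the helper names are placeholders, not the actual module functions):

```js
// Placeholders standing in for the existing payload-resolution and SDK-call code.
const resolvePayloadFromState = (state) => state.publish;
const publishToSdk = (payload) => Promise.resolve(payload);

// doPublish now accepts an optional pre-built payload; when omitted it behaves
// exactly as before, deriving the payload from the redux publish form state.
export const doPublish = (success, fail, preparedPayload) => (dispatch, getState) => {
  const payload = preparedPayload || resolvePayloadFromState(getState());
  return publishToSdk(payload).then(success, fail);
};

// A resume action can then replay a payload stashed in `currentUploads`
// without touching (or depending on) the form state.
```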
* Add doPublishResume
* Support resume-able upload via tus
## Issue
38 Handle resumable file upload
## Notes
Since we can't serialize a File object, we'll need the user to re-select the file to resume.
* Exclude "modified date" for Firefox/Android
## Issue
It appears that the modification date of an Android file changes when it is selected, so the file was deemed "different" when trying to resume the upload.
## Change
Exclude modification date for now. Let's assume a smart user.
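A sketch of the idea (the helper name and platform checks are illustrative): build the "same file?" fingerprint without `lastModified` on platforms where it is unstable.

```js
function getFileFingerprint(file) {
  const isFirefox = navigator.userAgent.includes('Firefox');
  const isAndroid = navigator.userAgent.includes('Android');
  const skipModifiedDate = isFirefox || isAndroid;

  return [
    file.name,
    file.size,
    skipModifiedDate ? '' : file.lastModified,
  ].join('|');
}
```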
* Move 'currentUploads' to 'publish' reducer
`publish` is currently rehydrated, so we can ride on that and don't need to store the `currentUploads` in `localStorage` for persistence. This would allow us to store Markdown Post data too, as `localStorage` has a 5MB limit per app.
We could have also made `webReducer` rehydrate, but in this repo there is no need to split it into another reducer. It also makes more sense as part of publish anyway (at least to me).
This change is mostly moving items between files, with the exception of
1. An additional REHYDRATE in the publish reducer to clean up the tusUploader.
2. Not clearing `currentUploads` in CLEAR_PUBLISH.
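Roughly what item 1 looks like, as a sketch (the action-type constant, state shape, and `uploader` field name are assumptions):

```js
// Stands in for redux-persist's rehydrate action type.
const REHYDRATE = 'persist/REHYDRATE';

function publishReducer(state = { currentUploads: {} }, action) {
  switch (action.type) {
    case REHYDRATE: {
      const restored = (action.payload && action.payload.publish) || state;
      const currentUploads = {};
      // Drop the live tus uploader handles; they can't survive a page reload.
      Object.entries(restored.currentUploads || {}).forEach(([key, value]) => {
        currentUploads[key] = { ...value, uploader: undefined };
      });
      return { ...restored, currentUploads };
    }
    default:
      return state;
  }
}
```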
* Restore v1 code for livestream replay, etc.
v2 (tus) does not handle `remote_url`, so the app still needs v1 for that. Since we'll still have v1 code, use v1 for previews as well.
* Fix error logs
* Improve LBC sticker flow/clarity
* Show inline error if custom sticker amount below min
* Sort emojis alphabetically
* Improve loading of Images
* Improve quality and display of emojis and fix CSS
* Display both USD and LBC prices
* Default to LBC tip if creator can't receive USD
* Don't clear text-field after sticker is sent
* Refactor notification component
* Handle notifications
* Don't show profile pic on sticker livestream comments
* Change Sticker icon
* Fix wording and number rounding
* Fix blurring emojis
* Disable non-functional emote buttons
* Prevent multiple parseURI calls
## Ticket
129
## Issue
Code was shortened to use `isURIValid` during the consolidation. `isURIValid` calls `normalizeURI`, which calls another `parseURI`.
`parseURI` is pretty expensive.
## Approach
- Add optional parameter to `isURIValid` to skip the normalization.
- Set those that were converted during the consolidation to skip the normalization. Also covered a few other instances where it is obvious to me that normalization is not required.
- For the rest, I can't tell for sure if it's safe to remove the normalization, so the default `normalize=true` will leave things as is.
The whole `parseURI` probably needs a refactoring, or a few lighter versions for specific needs.
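A sketch of the optional flag, assuming the existing `parseURI` / `normalizeURI` helpers (the real signatures and import path may differ):

```js
import { parseURI, normalizeURI } from 'lbry-redux'; // assumed import path

// `normalize` defaults to true so existing callers keep the old behavior;
// callers that already hold a normalized uri can pass false and skip a parseURI.
export function isURIValid(uri, normalize = true) {
  try {
    const target = normalize ? normalizeURI(uri) : uri;
    return Boolean(parseURI(target));
  } catch (e) {
    return false;
  }
}
```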
* Simplify isURIEqual
## Issue
`parseURI` is too expensive to be used in a loop, plus `normalizeURI` itself is calling `parseURI`.
## Approach
Not sure if it covers all cases, but just try converting colons to hashes before comparing.
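As a sketch, the whole comparison reduces to something like this (hedged; the actual handling of protocols and edge cases may need more care):

```js
// Treat the `name:claimId` and `name#claimId` forms as equal without a full parseURI.
export function isURIEqual(uriA, uriB) {
  const simplify = (uri) => (uri || '').replace('lbry://', '').replace(/:/g, '#');
  return simplify(uriA) === simplify(uriB);
}
```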
* Refactor filePrice
* Refactor Wallet Tip Components
* Add backend sticker support for comments
* Add stickers
* Refactor commentCreate
* Add Sticker Selector and sticker comment creation
* Add stickers display to comments and hyperchats
* Fix wrong checks for total Super Chats
## Issue
We previously reloaded automatically when there was a chunk error. This works fine when the cause is that new code was pushed while the user was active, but if the failure was caused by something else, like network problems or the file actually being missing, we end up in an infinite loop of refreshes.
## New approach
Tell the user to reload instead of automatically doing it.
* Fix CSS for live chat embeds
* Fix Markdown Lists in Comments
* Disable copy link menu option on livestream comments
* Fix nested indents in Live Chat
* Fix mentions and timestamps not parsed in bullet lists
* Highlight livestream comment and menu button on hover
* Fix mention parsing
* adding functionality to detect user download speed
* calculating bandwidth speed more intelligently
* saving download speed and updating it every 30s
* all the functionality should be done; needs testing
* fix linting
* use a 1mb file for calculating bandwidth
* add optional chaining plugin to babel and get bitrate from texttrack
* allow optional chaining for flow
* ignore flow error
* disable bandwidth checking functionality
* fix flow error
## Issue
.../archives/C02FQBM00Q0/p1633044695010600
## Changes
When querying a search key, it has to be an exact match. This was broken by the insertion of `free_only` in the fetch.
Added a function to generate the options, so that all clients stay in sync.
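The gist of the shared generator, as a sketch (the option fields and function names are illustrative):

```js
// Single source of truth for the search options, so the fetch and every
// selector derive exactly the same key.
function getRecommendedSearchOptions(query, freeOnly) {
  const options = { query, size: 20 };
  if (freeOnly) {
    options.free_only = true;
  }
  return options;
}

const getSearchKey = (options) => JSON.stringify(options);

// Both sides:
//   const options = getRecommendedSearchOptions(query, freeOnly);
//   dispatch(doSearch(options));                       // fetch, keyed by getSearchKey(options)
//   selectSearchResults(state, getSearchKey(options)); // read, same key
```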
* Add Channel Mention selection ability
* Fix mentioned user name being smaller than other text
* Improve logic for locating a mention
* Fix mentioning with enter on livestream
* Fix breaking for invalid URI query
* Handle punctuation after mention
* Fix name display and appearance
* Use canonical url
* Fix missing search
* ❌ Remove old method of displaying active livestreams
Completely remove it for now to make the commit deltas clearer.
We'll replace it with the new method at the end.
* Fetch and store active-livestream info in redux
* Tiles can now query active-livestream state from redux instead of getting it from the parent.
* ⏪ ClaimTilesDiscover: revert and cleanup
## Simplify
- Simplify to just `uris` instead of having multiple arrays (`uris`, `modifiedUris`, `prevUris`)
- `prevUris` was there for CLS prevention. With this removal, the CLS issue is back, but we'll handle it differently later.
- Temporarily disable the view-count fetching. Code is left there so that I don't forget.
## Fix
- `shouldPerformSearch` was never true when `prefixUris` was present. Corrected the logic.
- Aside: prefix and pin are so similar in function. Hm ....
* ClaimTilesDiscover: factor out options
## Change
Move the `options` code outside and pass it in as a pre-calculated prop.
## Reason
To skip rendering while waiting for `claim_search`, we need to add `React.memo(areEqual)`. However, the flag that determines if we are fetching `claim_search` (fetchingClaimSearchByQuery[]) depends on the derived options as the key.
Instead of calculating `options` twice, we moved it to the props so both sides can use it.
It also makes the component a bit more readable.
The downside is that the prop-passing might not be clear.
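Schematically (names are illustrative), the memo comparator needs the same `options` object the parent used for the fetch:

```js
import React from 'react';

function ClaimTilesDiscoverBase({ uris, options }) {
  return null; // render tiles for `uris`...
}

function areEqual(prevProps, nextProps) {
  // Skip re-rendering while the claim_search keyed by these options is still in flight.
  if (nextProps.fetchingClaimSearchByQuery[JSON.stringify(nextProps.options)]) {
    return true;
  }
  return prevProps.uris === nextProps.uris;
}

export default React.memo(ClaimTilesDiscoverBase, areEqual);
```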
* ClaimTilesDiscover: reduce ~17 renders at startup to just 2.
* ClaimTilesDiscover: fill with placeholder while waiting for claim_search
## Issue
Livestream claims are fetched separately, so they might already exist. While claim_search is running, the list only consists of livestreams (collapsed).
## Fix
Fill up the space with placeholders to prevent layout shift.
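A sketch of the padding (tile count and placeholder convention are illustrative):

```js
// Pad the rendered list with placeholders while claim_search is in flight, so the
// row keeps its final height and the livestream tiles don't cause a layout shift.
const TILE_COUNT = 8; // illustrative page size

function withPlaceholders(uris, isFetching) {
  if (!isFetching || uris.length >= TILE_COUNT) {
    return uris;
  }
  return [...uris, ...Array(TILE_COUNT - uris.length).fill(null)]; // null => placeholder tile
}
```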
* Add 'useFetchViewCount' to handle fetching from lists
This effect also stashes fetched uris, so that we won't re-fetch the same uris during the same instance (e.g. during infinite scroll).
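A sketch of the hook's shape (the names and the fetch action are assumptions): fetch view counts only for uris we haven't asked about yet, and remember them for the lifetime of this component instance.

```js
import React from 'react';

function useFetchViewCount(uris, doFetchViewCount) {
  const fetchedUris = React.useRef(new Set());

  React.useEffect(() => {
    const newUris = (uris || []).filter((uri) => !fetchedUris.current.has(uri));
    if (newUris.length > 0) {
      newUris.forEach((uri) => fetchedUris.current.add(uri));
      doFetchViewCount(newUris); // e.g. another page from infinite scroll
    }
  }, [uris, doFetchViewCount]);
}
```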
* ⏪ ClaimListDiscover: revert and cleanup
## Revert
- Removed the 'finalUris' stuff that was meant to "pause" visual changes when fetching. I think it'll be cleaner to use React.memo to achieve that.
## Alterations
- Added `renderUri` to make it clear which array that this component will render.
- Re-do the way we fetch view counts now that 'finalUris' is gone. Not the best method, but at least correct for now.
* ClaimListDiscover: add prefixUris, similar to ClaimTilesDiscover
This will be initially used to append livestreams at the top.
* ✅ Re-enable active livestream tiles using the new method
* doFetchActiveLivestreams: add interval check
- Added a default minimum of 5 minutes between fetches. Clients can bypass this through `forceFetch` if needed.
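A sketch of the throttle (the constant and state field names are assumptions):

```js
const ACTIVE_LIVESTREAM_FETCH_MIN_INTERVAL_MS = 5 * 60 * 1000;

const doFetchActiveLivestreams = (forceFetch = false) => (dispatch, getState) => {
  const { lastFetchedActiveLivestreams } = getState().livestream || {};
  const now = Date.now();

  if (
    !forceFetch &&
    lastFetchedActiveLivestreams &&
    now - lastFetchedActiveLivestreams < ACTIVE_LIVESTREAM_FETCH_MIN_INTERVAL_MS
  ) {
    return; // recent enough; skip the query
  }

  dispatch({ type: 'FETCH_ACTIVE_LIVESTREAMS_STARTED', data: { fetchedAt: now } });
  // ...perform the actual claim_search / livestream API call here...
};
```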
* doFetchActiveLivestreams: add option check
We'll need to support different 'orderBy' values, so add an "options check" when determining whether we just made the same fetch.
* WildWest: limit livestream tiles + add ability to show more
Most likely this behavior will change in the future, so we'll leave `ClaimListDiscover` untouched and handle the logic at the page level.
This solution uses two `ClaimListDiscover` instances -- if the reduced livestream list is visible, it handles the header; otherwise the normal list does.
* Use better tile-count on larger screens.
Used the same method as the homepage.