make it easier to test with mainnet data #188
Reference: LBRYCommunity/lbrycrd#188
Includes:
...
I considered three different approaches for this:
I'm planning on option 3. It seems to be the most flexible, although it's a bit more work. Things I think we should not do: make regtest mode connect to any network, or allow simultaneous file access. (All three options require that the data being copied not be in use at the time.)
Have you looked at approaches taken by others? This seems like something that would have come up for bitcoin, though I googled briefly with no success.
After this question, I did look around for other approaches but found nothing. I then attempted to write the copy in C++, but gave up: I was unable to update the address prefix bytes on incoming data without invalidating all the hashes, which essentially made the result useless. I ended up with the script approach for the transfer.
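To make the prefix-byte issue concrete, here is a minimal sketch of Base58Check address re-encoding. The version bytes used below (0x55 for lbrycrd mainnet P2PKH, 0x6f for regtest, matching bitcoin testnet) are assumptions for illustration, not confirmed values. The point is that the network prefix lives only in the human-readable address string; the hash160 that actually appears in on-chain scripts is unchanged by re-encoding, which is why rewriting prefix bytes inside serialized chain data only destroys txids and block hashes without gaining anything.

```python
import hashlib

# Base58 alphabet used by bitcoin-family coins
ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def b58check_encode(version: int, hash160: bytes) -> str:
    """Encode a version byte plus 20-byte hash160 as a Base58Check address."""
    payload = bytes([version]) + hash160
    checksum = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    raw = payload + checksum
    n = int.from_bytes(raw, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = ALPHABET[r] + out
    # leading zero bytes are encoded as leading '1' characters
    pad = len(raw) - len(raw.lstrip(b"\x00"))
    return "1" * pad + out

def b58check_decode(addr: str) -> tuple[int, bytes]:
    """Decode a P2PKH address back into (version byte, hash160)."""
    n = 0
    for c in addr:
        n = n * 58 + ALPHABET.index(c)
    raw = n.to_bytes(25, "big")  # 1 version + 20 hash + 4 checksum
    payload, checksum = raw[:-4], raw[-4:]
    expect = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    if checksum != expect:
        raise ValueError("bad checksum")
    return payload[0], payload[1:]

def convert_address(addr: str, new_version: int) -> str:
    """Re-encode an address for a different network prefix.
    The underlying hash160 (the on-chain data) is preserved."""
    _, h160 = b58check_decode(addr)
    return b58check_encode(new_version, h160)
```

A mainnet address converted this way spends from the same key on regtest, since the script only commits to the hash160, not to the address string.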
From earlier conversations it sounded like copying over all the supports was less necessary; this was more about easily testing memory usage given a large number of claims. The proposed scripts don't add any supports.
If we import the data (claim names & values & supports), but it's all tied to the same user (with millions of addresses) -- is there any value in that?