make it easier to test with mainnet data #188

Open
opened 2018-08-09 18:43:41 +02:00 by BrannonKing · 5 comments
BrannonKing commented 2018-08-09 18:43:41 +02:00 (Migrated from github.com)

Includes:

  1. Running the daemon in "regtest" mode but using mainnet data.

...

BrannonKing commented 2018-08-14 00:32:07 +02:00 (Migrated from github.com)

I considered three different approaches for this:

  1. Pass in an additional command-line parameter (alongside -regtest) that copies over the current "mainnet" saved data wholesale.
  2. Make a separate "devtools" script to do #1.
  3. Make an RPC command to transfer in x number of blocks from the mainnet or testnet saved data.

I'm planning on #3. It seems to be the most flexible, although it's a bit more work. Things that I think we should not do: make regtest mode connect to any network, or allow simultaneous file access. (All three options require that the data being copied is not currently in use.)
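For concreteness, options 1 and 2 both amount to a wholesale copy of the saved chain state into the regtest data directory. A rough sketch of what such a devtools script might look like is below; the directory names are assumptions based on a typical lbrycrd datadir layout, not a confirmed implementation, and both daemons have to be stopped while it runs:

```python
# Hypothetical devtools-style sketch: copy the saved mainnet chain state into
# the regtest data directory wholesale.  The datadir layout ("blocks",
# "chainstate", "claimtrie", and a "regtest" subdirectory) is an assumption,
# and neither daemon may be running while the copy happens.
import shutil
from pathlib import Path

MAINNET_DIR = Path.home() / ".lbrycrd"    # assumed mainnet datadir location
REGTEST_DIR = MAINNET_DIR / "regtest"     # regtest keeps its state in a subdir

for subdir in ("blocks", "chainstate", "claimtrie"):
    src, dst = MAINNET_DIR / subdir, REGTEST_DIR / subdir
    if dst.exists():
        shutil.rmtree(dst)                # discard any stale regtest copy
    shutil.copytree(src, dst)
    print(f"copied {src} -> {dst}")
```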

kauffj commented 2018-08-14 15:18:58 +02:00 (Migrated from github.com)

Have you looked at approaches taken by others? This seems like something that would have come up for bitcoin, though I googled briefly with no success.

BrannonKing commented 2018-08-27 23:40:59 +02:00 (Migrated from github.com)

After the question, I did look around for other approaches but found nothing. After attempting to write the copy in C++, I gave up: I was unable to update the address prefix bytes on incoming data without destroying all the hashes, which made the result useless. I ended up with the script approach for the transfer.
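For anyone curious why the in-place rewrite is a dead end: a transaction id is the double-SHA256 of the serialized transaction, and blocks commit to those ids through the merkle root, so changing even a single byte of saved data invalidates every hash above it. A minimal illustration (the transaction bytes below are just a stand-in, not real chain data):

```python
# Changing one byte of a serialized transaction produces a completely
# different txid, which then breaks the merkle root and block hash that
# commit to it.  The "transaction" here is a made-up byte string.
import hashlib

def double_sha256(data: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

tx = bytes.fromhex("0100000001" + "ab" * 60)   # stand-in for serialized tx bytes
patched = bytes([tx[0] ^ 0x01]) + tx[1:]       # flip a single bit, e.g. a prefix byte

print(double_sha256(tx).hex())
print(double_sha256(patched).hex())            # bears no resemblance to the original
```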

BrannonKing commented 2018-08-27 23:51:51 +02:00 (Migrated from github.com)

From earlier conversations it sounded like copying over all the supports was less necessary; this was more about easily testing memory usage given a large number of claims. The proposed scripts don't add any supports.
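For context, the transfer script is roughly of this shape: dump the claims from a running mainnet node and re-create them on the regtest node over RPC, without replaying any supports. The RPC names (getclaimsintrie, claimname), CLI flags, and result fields used here are assumptions about lbrycrd's interface rather than the exact implementation:

```python
# Hypothetical outline of the claim-transfer idea (claims only, no supports).
# RPC names, CLI flags, and result fields are assumptions, not the exact
# interface of the proposed scripts.
import json
import subprocess

MAINNET = ["lbrycrd-cli"]               # talks to the mainnet node
REGTEST = ["lbrycrd-cli", "-regtest"]   # talks to the regtest node

def cli(base, *args):
    return subprocess.check_output(base + list(args), text=True)

# Dump the current claim trie from the mainnet node.
trie = json.loads(cli(MAINNET, "getclaimsintrie"))

for node in trie:
    for claim in node.get("claims", []):
        # Re-create each claim on regtest with a nominal amount; the value is
        # replayed verbatim, but ownership moves to the regtest wallet.
        cli(REGTEST, "claimname", node["name"], claim["value"], "0.01")
```

If supports ever turn out to matter for the memory tests, they could presumably be replayed the same way (e.g. via a supportclaim call), but that is outside what the scripts do today.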

BrannonKing commented 2019-09-07 00:15:51 +02:00 (Migrated from github.com)

If we import the data (claim names & values & supports), but it's all tied to the same user (with millions of addresses) -- is there any value in that?

Reference: LBRYCommunity/lbrycrd#188