Merge #16464: [qa] Ensure we don't generate a too-big block in p2sh sigops test

bf3be5297a [qa] Ensure we don't generate a too-big block in p2sh sigops test (Suhas Daftuar)

Pull request description:

  There's a bug in the loop that calculates the block size in the p2sh sigops test: we start with the size of the block when it has no transactions, and then increment by the size of each transaction we add, without accounting for the changing size of the compact size encoding of the number of transactions in the block.
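
  For context (not part of the PR), here is a minimal sketch of why the running total drifts: the block's transaction count is serialized as a CompactSize integer, whose width grows with the count, so summing per-transaction sizes on top of the empty-block size misses those extra prefix bytes. The helper name below is hypothetical, not from the test framework.

      def compact_size_len(n: int) -> int:
          """Serialized width in bytes of a Bitcoin CompactSize integer."""
          if n < 253:
              return 1
          if n <= 0xFFFF:
              return 3
          if n <= 0xFFFFFFFF:
              return 5
          return 9

      # Adding the 253rd transaction widens the count prefix from 1 byte to 3,
      # so a total that only sums per-transaction sizes falls 2 bytes short.
      assert compact_size_len(252) == 1
      assert compact_size_len(253) == 3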

  This might be fine if the block construction were deterministic, but the first transaction in the block has an ECDSA signature whose DER encoding varies in length, so we see intermittent failures of this test when the initial transaction has a 70-byte signature and the block ends up being one byte too big.
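
  Again for context (hypothetical helpers, not from the PR): Bitcoin's ECDSA signatures are DER-encoded, and DER strips leading zero bytes from r and s, then re-adds a single 0x00 pad byte only when the high bit of the top byte is set. Whether those conditions hit is effectively random per signature, which is where the length variance, and hence the flakiness, comes from.

      def der_int_len(v: int) -> int:
          """Byte width of one INTEGER body inside a DER signature."""
          n = max(1, (v.bit_length() + 7) // 8)
          if (v >> (8 * n - 1)) & 1:  # high bit set -> one 0x00 pad byte
              n += 1
          return n

      def der_sig_len(r: int, s: int) -> int:
          """SEQUENCE header (2) plus two INTEGER headers (2 each) plus bodies.
          The 1-byte sighash flag Bitcoin appends is not counted here."""
          return 2 + 2 + der_int_len(r) + 2 + der_int_len(s)

      # A 32-byte r with its top bit clear and a low-s value give 70 bytes;
      # an r that needs the pad byte gives 71 instead.
      assert der_sig_len((1 << 255) - 1, (1 << 255) - 1) == 70
      assert der_sig_len(1 << 255, (1 << 255) - 1) == 71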

  Fix this by double-checking the block size after construction, dropping transactions until the block fits.

ACKs for top commit:
  jonasschnelli:
    utACK bf3be5297a
  jnewbery:
    tested ACK bf3be5297a

Tree-SHA512: f86385b96f7a6feafa4183727f5f2c9aae8ad70060b574aad13b150f174a17ce9a0040bc51ae7a04bd08f2a5298b983a84b0aed5e86a8440189ebc63b99e64dc
Commit 3489b71512, merged by MarcoFalke, 2019-07-28 10:12:46 -04:00

@@ -486,6 +486,14 @@ class FullBlockTest(BitcoinTestFramework):
             tx_last = tx_new
             b39_outputs += 1
 
+        # The accounting in the loop above can be off, because it misses the
+        # compact size encoding of the number of transactions in the block.
+        # Make sure we didn't accidentally make too big a block. Note that the
+        # size of the block has non-determinism due to the ECDSA signature in
+        # the first transaction.
+        while (len(b39.serialize()) >= MAX_BLOCK_BASE_SIZE):
+            del b39.vtx[-1]
+        b39 = self.update_block(39, [])
         self.send_blocks([b39], True)
         self.save_spendable_output()