This commit makes use of the new ScriptDiscourageUpgradableNops flag to
reject execution of NOP1 through NOP10 for transactions that are
considered standard.
This mirrors the behavior added to Bitcoin Core via pull request 5000.
This commit modifies various regression tests to make them more consistent
with other tests throughout the code base.
It also allows all of the tests to run in parallel.
NOP1 through NOP10 are reserved for future soft-fork upgrades. When
such an upgrade occurs, the NOP argument will then require verification.
Refusing to accept transactions that contain these NOPs into the mempool
will discourage those transactions from being relayed and mined elsewhere
and ensure btcd itself will never mine them. This helps prevent mining
scripts that have become invalid (according to the majority of the
hashing power) even if the client has not yet upgraded.
Non-executed upgradable NOPs are still allowed as they will still be
valid post-upgrade.
Mimics Bitcoin Core commit 03914234b3c9c35d66b51d580fe727a0707394ca
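The check itself is small. The following sketch illustrates the idea; the
constant values match the standard script opcode numbering for OP_NOP1
through OP_NOP10, but all names here are illustrative rather than taken
from the script engine:

    // Hypothetical local names for the upgradable NOP opcode range; the
    // values match the standard script numbering for OP_NOP1 through
    // OP_NOP10.
    const (
        opNop1  byte = 0xb0
        opNop10 byte = 0xb9
    )

    // isDiscouragedNOP reports whether executing the given opcode should be
    // rejected when the discourage flag is set.  Because the check only runs
    // for opcodes that are actually executed, upgradable NOPs sitting in an
    // untaken branch remain acceptable, matching the behavior described
    // above.
    func isDiscouragedNOP(opcode byte, discourageUpgradableNops bool) bool {
        return discourageUpgradableNops && opcode >= opNop1 && opcode <= opNop10
    }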
BIP0062 defines specific rules and canonical encodings for data pushes.
The existing script builder code already conformed to all but one of the
canonical data push rules; the exception, which was added to the
specification after the builder was originally implemented, is that
pushing the single byte 0x81 must be converted to OP_1NEGATE. This
commit implements that case and expands the existing tests to explicitly
cover all cases mentioned in BIP0062.
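For reference, a condensed sketch of the full set of canonical encodings
follows; the opcode values are the standard script ones, and the function
is purely illustrative rather than the package's internal helper:

    // canonicalPush encodes data using the smallest possible representation
    // per BIP0062.  Canonical pushes never exceed 520 bytes, so the
    // OP_PUSHDATA4 branch is only reachable for the oversized test data
    // mentioned below.
    func canonicalPush(data []byte) []byte {
        dataLen := len(data)
        switch {
        case dataLen == 0 || (dataLen == 1 && data[0] == 0):
            return []byte{0x00} // OP_0
        case dataLen == 1 && data[0] >= 1 && data[0] <= 16:
            return []byte{0x50 + data[0]} // OP_1 through OP_16
        case dataLen == 1 && data[0] == 0x81:
            return []byte{0x4f} // OP_1NEGATE -- the case added here
        case dataLen <= 75:
            return append([]byte{byte(dataLen)}, data...) // direct push
        case dataLen <= 255:
            return append([]byte{0x4c, byte(dataLen)}, data...) // OP_PUSHDATA1
        case dataLen <= 65535:
            prefix := []byte{0x4d, byte(dataLen), byte(dataLen >> 8)}
            return append(prefix, data...) // OP_PUSHDATA2, little-endian len
        default:
            prefix := []byte{0x4e, byte(dataLen), byte(dataLen >> 8),
                byte(dataLen >> 16), byte(dataLen >> 24)}
            return append(prefix, data...) // OP_PUSHDATA4, little-endian len
        }
    }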
In addition, as part of this change, the AddData function has been
modified so that any attempt to push more than the maximum script element
size (520 bytes) in a single push, or any push that would cause the
script to exceed the maximum script size allowed by the script engine
(10000 bytes), will result in the final call to the Script function
returning only the script up to the point of the first error, along with
the error. This change should have little effect on existing callers
since they are almost certainly not creating scripts which violate these
rules, as such scripts could never be executed; however, it does mean
they need to check the new error return.
Since the regression tests intentionally need to be able to exceed those
limits, a new function named AddFullData has been added which does not
enforce the limits, but still provides canonical encoding of the pushed
data.
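For callers, the practical difference is the error check on the final
Script call. A usage sketch follows, assuming the builder is exposed as
txscript.NewScriptBuilder; the import path, the OP_RETURN usage, and the
helper function itself are illustrative, not part of this commit:

    // buildDataScript demonstrates checking the new error return from the
    // script builder (assumed import: the txscript package providing
    // NewScriptBuilder, AddOp, AddData, and Script).
    func buildDataScript(data []byte) ([]byte, error) {
        builder := txscript.NewScriptBuilder()
        builder.AddOp(txscript.OP_RETURN)
        builder.AddData(data)

        script, err := builder.Script()
        if err != nil {
            // AddData refused the push (more than 520 bytes in a single
            // push, or the script would exceed 10000 bytes).  The partial
            // script returned alongside the error must not be used.
            return nil, err
        }
        return script, nil
    }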
Note that this commit does not affect consensus rules nor modify the
script engine.
Also, the tests have been marked so they can run in parallel.
This utility is a relic from initial development before it was possible to
request blocks and transactions via RPC. The correct way to do this now
is via RPC, which works while btcd is running, whereas the utility
requires an exclusive lock on the database.
Currently, the reference client bans peers that send alerts not signed
with its key. We could verify against their key, but since the
reference client developers are unwilling to support other
implementations' alert messages, we will not relay theirs.
This commit contains the entire btcdb repository along with several
changes needed to move all of the files into the database directory in
order to prepare it for merging. This does NOT update btcd or any of the
other packages to use the new location as that will be done separately.
- All import paths in the old btcdb test files have been changed to the
new location
- All references to btcdb as the package name have been changed to
database
- The coveralls badge has been removed since it unfortunately doesn't
support coverage of sub-packages
This is ongoing work toward #214.
This was previously hard-coded to zero instead of using the offset
provided by the median time source, which takes time samples from other
connected nodes.
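A minimal sketch of the difference, using hypothetical identifiers rather
than the ones actually touched by this commit:

    import "time"

    // networkAdjustedTime returns the local clock adjusted by the median
    // offset computed from the time samples of connected peers.  Previously
    // the offset applied here was effectively hard-coded to zero.
    func networkAdjustedTime(now time.Time, medianOffset time.Duration) time.Time {
        return now.Add(medianOffset)
    }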
Only two operations are performed with this data structure: adding to
the back and removing from the front. Because middle inserts and
deletions are never needed, a linked list results in overall worse
performance due to an extra allocation for each element's node, worse
cache locality, and the runtime cost of boxing/unboxing each item
during accesses.
On top of the performance gains, a slice is also more type safe since it
holds a concrete element type, making it impossible to insert or access
an element with the wrong type.
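A minimal sketch of the slice-backed pattern, with a hypothetical element
type standing in for the actual queued items:

    // queuedItem stands in for whatever concrete type the queue holds.
    type queuedItem struct {
        // payload fields elided
    }

    // itemQueue is a FIFO backed by a slice: items are appended to the back
    // and sliced off the front.
    type itemQueue struct {
        items []*queuedItem
    }

    // pushBack adds an item to the back of the queue.
    func (q *itemQueue) pushBack(item *queuedItem) {
        q.items = append(q.items, item)
    }

    // popFront removes and returns the item at the front of the queue, or
    // nil when the queue is empty.
    func (q *itemQueue) popFront() *queuedItem {
        if len(q.items) == 0 {
            return nil
        }
        item := q.items[0]
        q.items[0] = nil // drop the reference so the item can be collected
        q.items = q.items[1:]
        return item
    }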