summa-tx / bitcoin-spv
utilities for Bitcoin SPV proof verification on (many!) other chains
License: Other
I'm trying to compile swap-demo.c via make risc.
Got this error:
csrc/swap-demo.c: In function 'load_args':
csrc/swap-demo.c:70:3: error: unknown type name 'mol_seg_t'
70 | mol_seg_t script_seg;
| ^~~~~~~~~
It looks like the include "molecule_builder.h" is missing in swap-demo.c. Am I missing anything in my setup?
Hi guys,
It would be useful to add a getter method for retrieving a Transaction from SPVStore.sol, similar to getTransactionInput. Perhaps something like:
function getTransaction(bytes32 _txid) public view returns (bool, uint32, uint8, uint8) {
    return (
        transactions[_txid].validationComplete,
        transactions[_txid].locktime,
        transactions[_txid].numInputs,
        transactions[_txid].numOutputs);
}
Would it make sense to add something like this? If so, I'd be happy to contribute it. Thanks!
Quality-of-life improvements to testing in testVectors.json. E.g. ALWAYS use solidityError, goError, and jsError instead of shoving them all into errorMessage.
We have bitcoin-spv-sol and bitcoin-spv-js npm packages.
Right now, the Solidity package only provides the contracts. It would be nice to take utilities we've made for JS (e.g. proof (de)serialization) and re-export them from the Solidity npm package.
We added a few switch/case statements and the new address encoding functions to the utils file
Hi! Thanks for the interesting library. I'm attempting to run as detailed in the readme but running into an issue.
When running:
pipenv run python scripts/merkle.py 10e3eaed1ef21787944f0d151a1a6553397e2ee4074887e344a112e13f22b70b
I get:
Tx is too large. Expect less than 100kB. Got: 153762 bytes
Am I doing this correctly?
I'm using Python 3.6.6 on osx
Thanks!
The library exposes redundant implementations of Bitcoin's double sha256.
solidity native implementation with an overzealous type correction #16
bitcoin-spv/solidity/contracts/BTCUtils.sol
Lines 110 to 116 in f5e4da0
assembly implementation
Note this implementation does not handle errors when staticcall'ing the precompiled sha256
contract (private chains).
bitcoin-spv/solidity/contracts/BTCUtils.sol
Lines 118 to 129 in f5e4da0
We recommend providing only one implementation for calculating the double sha256
as maintaining two interfaces for the same functionality is not desirable. Furthermore, even though the assembly implementation is saving gas, we recommend keeping the language provided implementation.
extractHash checks that the templates for P2PKH, P2SH, P2WPKH, and P2WSH are valid (proper prefix and suffix), but the actual size of the provided script is never checked.
bitcoin-spv/solidity/contracts/BTCUtils.sol
Lines 406 to 434 in 3a35db8
Therefore a script encoded under the following format will be accepted:
P2PKH: 1976a914 <script_with_more_than_20_bytes> 88ac
P2SH: 17a914 <script_with_more_than_20_bytes> 87
P2WPKH: 0014 <script_with_more_than_20_bytes>
P2WSH: 0020 <script_with_more_than_32_bytes>
This can cause issues if the provided data does not come from a validated transaction, for instance if the data is being provided directly from a user.
This is related to keep-network/tbtc#658
The following call won't be caught, even though the script is invalid (notice the extra 88 near the end of the script):
extractHash(0x4062b007000000001976a914f86f0bc0a2232970ccdf4569815db500f12683618888ac)
Add an additional check for maliciously formatted scripts to check the actual length of the provided script.
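A sketch of the recommended fix, using the P2PKH template as an example (a hypothetical helper, not the library's extractHash): validate the total script length in addition to the prefix and suffix.

```javascript
// Hypothetical helper illustrating the recommended length check; not the
// library API. A P2PKH output script (including its 0x19 length prefix) is
// exactly 26 bytes: 1976a914 <20-byte hash> 88ac.
function extractP2PKHHash(scriptHex) {
  const script = Buffer.from(scriptHex.replace(/^0x/, ''), 'hex');
  const templateOk = script.length >= 26
    && script.slice(0, 4).toString('hex') === '1976a914'
    && script.slice(-2).toString('hex') === '88ac';
  if (!templateOk || script.length !== 26) return null; // reject padded scripts
  return script.slice(4, 24); // the 20-byte pubkey hash
}
```

With this check, the malicious script above (with the extra 88) is rejected even though its prefix and suffix match the template.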
I tried to run:
pipenv run python scripts/merkle.py 79cbf066c8fd6f5503ba5ca11499a205c32280337d5d595aa05f958f2badb624
It failed with a following error stack trace:
Traceback (most recent call last):
File "scripts/merkle.py", line 227, in <module>
main(tx_id, num_headers)
File "scripts/merkle.py", line 219, in main
asyncio.get_event_loop().run_until_complete(do_it_all(tx_id, num_headers))
File "/Users/jakub/.pyenv/versions/3.6.6/lib/python3.6/asyncio/base_events.py", line 468, in run_until_complete
return future.result()
File "scripts/merkle.py", line 180, in do_it_all
(tx_json, t) = await get_tx(tx_id)
File "scripts/merkle.py", line 106, in get_tx
latest_blockheight = await get_latest_blockheight()
File "scripts/merkle.py", line 88, in get_latest_blockheight
return block_dict['block_height']
KeyError: 'block_height'
block_height should be replaced with height.
Sample block_dict received inside the get_latest_blockheight() function:
{'hex': '00008020becf7e0d9f907582aa71142bf1d4eb6d94c1d3778246290000000000000000000f5490fb4c1a11327d806844f76b817410ade60f6d82c47533ae249e2aee9c7bd470e65c45fb2917b138e8c8', 'height': 577391}
ValidateSPV.validateHeaderChain() returns one of three different errors if it fails.
export function validateHeaderChain(headers) {
// Check header chain length
if (headers.length % 80 !== 0) {
throw new TypeError('Header bytes not multiple of 80.');
}
let digest;
let totalDifficulty = BigInt(0);
for (let i = 0; i < headers.length / 80; i += 1) {
// ith header start index and ith header
const start = i * 80;
const header = utils.safeSlice(headers, start, start + 80);
// After the first header, check that headers are in a chain
if (i !== 0) {
if (!validateHeaderPrevHash(header, digest)) {
throw new Error('Header bytes not a valid chain.');
}
}
// ith header target
const target = BTCUtils.extractTarget(header);
// Require that the header has sufficient work
digest = BTCUtils.hash256(header);
if (!validateHeaderWork(digest, target)) {
throw new Error('Header does not meet its own difficulty target.');
}
totalDifficulty += BTCUtils.calculateDifficulty(target);
}
return totalDifficulty;
}
When using this externally, this means having to do a switch/case to handle each of the different error messages. Comparing string error messages is prone to breakage if the message ever changes, and is not optimal.
// Current implementation when using externally
switch (e.message) {
case 'Header bytes not multiple of 80.':
return [false, errors.ERR_INVALID_CHAIN];
case 'Header bytes not a valid chain.':
return [false, errors.ERR_INVALID_PREV_BLOCK];
case 'Header does not meet its own difficulty target.':
return [false, errors.ERR_INVALID_PROOF_OF_WORK];
default:
return [false, errors.ERR_INVALID_CHAIN];
}
Suggestions:
Custom numeric error codes, where:
1 is the header-length error
2 is an invalid chain
3 is failure to meet the difficulty target
// custom error code implementation (assumes errors carry a numeric `code`)
switch (e.code) {
case 1:
return [false, errors.ERR_INVALID_CHAIN];
case 2:
return [false, errors.ERR_INVALID_PREV_BLOCK];
case 3:
return [false, errors.ERR_INVALID_PROOF_OF_WORK];
default:
return [false, errors.ERR_INVALID_CHAIN];
}
Alternatively, throw distinct error types:
throw new TypeError('Header bytes not multiple of 80.')
throw new Error('Header bytes not a valid chain.')
throw new RangeError('Header does not meet its own difficulty target.')
The user can compare the types of errors and handle them appropriately.
// Error types implementation
switch (true) {
case e instanceof TypeError:
return [false, errors.ERR_INVALID_CHAIN];
case e instanceof RangeError:
return [false, errors.ERR_INVALID_PROOF_OF_WORK];
case e instanceof Error:
return [false, errors.ERR_INVALID_PREV_BLOCK];
default:
return [false, errors.ERR_INVALID_CHAIN];
}
The obvious disadvantage of this is that there is a limited number of built-in error types, and the default will never be reached since all of these are Errors. New error types can be constructed, which might be a nice combination of both of these suggestions:
PrevBlockError
PrevHashError
MerkleRootError
Maybe we can compile some sort of shared bitcoin/blockchain specific error codes that can be used across projects.
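A sketch of that combination: dedicated error classes built on the standard types, so callers can branch on instanceof instead of message strings (class names are illustrative suggestions, not the current API):

```javascript
// Suggested error classes; names are illustrative, not the current API.
class BadHeaderLengthError extends TypeError {}
class InvalidChainError extends Error {}
class WorkTooLowError extends RangeError {}

// Callers branch on instanceof rather than comparing message strings.
function classify(e) {
  if (e instanceof BadHeaderLengthError) return 'ERR_INVALID_CHAIN';
  if (e instanceof WorkTooLowError) return 'ERR_INVALID_PROOF_OF_WORK';
  if (e instanceof InvalidChainError) return 'ERR_INVALID_PREV_BLOCK';
  return 'ERR_INVALID_CHAIN';
}
```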
Currently, contracts have a fixed pragma of 0.4.25. It would be super useful to have the contracts support solc 0.5.x.
I tried to bump the pragma to >=0.4.25 <0.6.0;
but encountered a few issues, e.g.
bitcoin-spv/contracts/BTCUtils.sol:44:33: TypeError: Explicit type conversion not allowed from "bytes1" to "uint256".
_number = _number + uint(_b[i]) * (2 ** (8 * (_b.length - (i + 1))));
bitcoin-spv/contracts/ValidateSPV.sol:66:41: TypeError: Return arguments required.
if (!validatePrefix(_prefix)) { return; }
Turns out this was unnecessary the whole time.
We should error on these instead of accept them.
Missing tests for:
Other methods that are not tested, but are used by other methods that pass their own tests, so I'm not sure if tests for these are necessary:
I tried to run:
pipenv run python scripts/merkle.py 79cbf066c8fd6f5503ba5ca11499a205c32280337d5d595aa05f958f2badb624
It failed with a following error stack trace:
Traceback (most recent call last):
File "scripts/merkle.py", line 227, in <module>
main(tx_id, num_headers)
File "scripts/merkle.py", line 220, in main
asyncio.get_event_loop().run_until_complete(do_it_all(tx_id, num_headers))
File "/Users/jakub/.pyenv/versions/3.6.6/lib/python3.6/asyncio/base_events.py", line 468, in run_until_complete
return future.result()
File "scripts/merkle.py", line 182, in do_it_all
(tx_json, t) = await get_tx(tx_id)
File "scripts/merkle.py", line 104, in get_tx
client = await get_client()
File "scripts/merkle.py", line 36, in get_client
CLIENT = await setup_client()
File "scripts/merkle.py", line 72, in setup_client
timeout=5)
File "/Users/jakub/.pyenv/versions/3.6.6/lib/python3.6/asyncio/tasks.py", line 358, in wait_for
return fut.result()
connectrum.exc.ElectrumErrorResponse: ({'code': 1, 'message': 'server.version already sent'}, {'id': 3, 'method': 'server.version', 'params': ('bitcoin-spv-merkle', '1.2')})
A change to the server.version electrum method has been introduced so that it is accepted only once: kyuupichan/electrumx@5ba5d05
The server.version call is already performed by the connectrum client on connect: https://github.com/coinkite/connectrum/blob/master/connectrum/client.py#L167.
The heavily used library function BytesLib.toBytes32() unnecessarily casts _source to bytes (the same type) and creates a copy of the dynamic byte array to check its length, while this can be done directly on the user-provided bytes _source.
bitcoin-spv/solidity/contracts/BytesLib.sol
Lines 399 to 408 in f5e4da0
function toBytes32(bytes memory _source) pure internal returns (bytes32 result) {
if (_source.length == 0) {
return 0x0;
}
assembly {
result := mload(add(_source, 32))
}
}
New test vector:
In retargetAlgorithm, the input should use blocks 66528, 68543, and 68544; the output is the difficulty of 68544.
Should be able to get all the info from the blockchain.info rawblock endpoint:
https://blockchain.info/rawblock/0000000000519051eb5f3c5943cdbc176a0eff4e1fbc3e08287bdb76299b8e5c
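For cross-checking the expected output, the difficulty can be derived from a header's nBits field; a BigInt sketch (helper names are assumed, not the library API):

```javascript
// Difficulty from a header's nBits compact-target field.
// Sketch with assumed helper names; not the library implementation.
const DIFF1_TARGET = 0xffffn << 208n; // target at difficulty 1

function targetFromBits(nBits) {
  const exponent = BigInt(nBits >>> 24);       // first byte: exponent
  const mantissa = BigInt(nBits & 0x00ffffff); // remaining bytes: mantissa
  return mantissa * 256n ** (exponent - 3n);
}

function difficulty(nBits) {
  return DIFF1_TARGET / targetFromBits(nBits); // integer division
}
```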
The bitcoin-spv library allows for multiple integer under-/overflows while processing or converting potentially untrusted or user-provided data.
uint8 underflow: uint256(uint8(_e - 3)). Also, _header[75] will throw, consuming all gas, if out of bounds, while the majority of the library usually uses slice(start, 1) to handle this more gracefully.
bitcoin-spv/solidity/contracts/BTCUtils.sol
Lines 483 to 494 in f5e4da0
uint8 overflow: uint256(uint8(_len + 8 + 1)). While VarInt handling is implemented for inputs, it is not for the output length.
bitcoin-spv/solidity/contracts/BTCUtils.sol
Lines 295 to 304 in f5e4da0
uint8 underflow: uint256(uint8(extractOutputScriptLen(_output)[0]) - 2)
bitcoin-spv/solidity/contracts/BTCUtils.sol
Lines 366 to 378 in f5e4da0
BytesLib input validation: multiple start+length overflows in the check start+length > start && bytes.length >= start+length
bitcoin-spv/solidity/contracts/BytesLib.sol
Lines 246 to 248 in f5e4da0
BytesLib input validation: multiple start overflows
bitcoin-spv/solidity/contracts/BytesLib.sol
Lines 280 to 281 in f5e4da0
bitcoin-spv/solidity/contracts/BytesLib.sol
Lines 269 to 270 in f5e4da0
bitcoin-spv/solidity/contracts/BytesLib.sol
Lines 246 to 248 in f5e4da0
bitcoin-spv/solidity/contracts/BytesLib.sol
Lines 410 to 412 in f5e4da0
We believe that a general-purpose parsing and verification library for Bitcoin payments should be very strict when processing untrusted user input. By strict we mean that it should rigorously validate provided input data and only proceed with processing if the data is within a safe-to-use range for the method to return valid results. Relying on the caller to provide pre-validated data can be unsafe, especially if the caller assumes that proper input validation is performed by the library.
Given the risk profile of this library, we recommend a conservative approach that favors security over gas efficiency, and that does not rely on certain calls or instructions to throw on invalid input.
For this issue specifically, we recommend proper input validation and explicit type expansion where necessary to prevent values from wrapping or processing data for arguments that are not within a safe-to-use range.
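As one concrete instance of this strict approach, a bounds-checked slice helper in the spirit of the JS package's utils.safeSlice; a sketch, not the exact library code:

```javascript
// Strict, bounds-checked slice: reject out-of-range reads instead of
// silently truncating or wrapping. Sketch; not the exact library code.
function safeSlice(buf, start, end) {
  if (!Number.isSafeInteger(start) || !Number.isSafeInteger(end)) {
    throw new RangeError('Slice bounds must be safe integers.');
  }
  if (start < 0 || end < start || end > buf.length) {
    throw new RangeError('Slice out of range.');
  }
  return buf.slice(start, end);
}
```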
CheckBitcoinSigs.accountFromPubkey() casts the bytes32 keccak256 hash of the pubkey to uint256, then to uint160, and finally to address, while the intermediate cast is not required.
bitcoin-spv/solidity/contracts/CheckBitcoinSigs.sol
Lines 15 to 25 in f5e4da0
The intermediate cast from uint256 to uint160 can be omitted. Refactor to return address(uint256(_digest)) instead.
The type correction encodePacked().toBytes32() is not needed, as sha256 already returns bytes32.
bitcoin-spv/solidity/contracts/BTCUtils.sol
Lines 114 to 117 in f5e4da0
Refactor to return sha256(abi.encodePacked(sha256(_b))); to save gas.
The sample CKB lock script is returning ERR_WRONG_PAYEE. It appears that out_lock_hash is masked by some value for no reason (e.g. 0xd7 instead of 0xd8).
https://github.com/summa-tx/bitcoin-spv/blob/master/c/csrc/swap-demo.c#L252-L275
To reproduce:
uncomment risc
and risc-test
here
run make
extractOutputScriptLength assumes the output script varint is only 1 byte long. Update it to handle multi-byte varints correctly, or remove the method.
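For reference, correct multi-byte CompactSize (VarInt) decoding looks like the following; a hypothetical helper, not the library API:

```javascript
// Bitcoin CompactSize (VarInt) decoding: 1, 3, 5, or 9 bytes depending on
// the first byte. Hypothetical helper; not the library API.
function parseVarInt(buf, offset = 0) {
  const first = buf[offset];
  if (first < 0xfd) return { value: BigInt(first), size: 1 };
  if (first === 0xfd) return { value: BigInt(buf.readUInt16LE(offset + 1)), size: 3 };
  if (first === 0xfe) return { value: BigInt(buf.readUInt32LE(offset + 1)), size: 5 };
  return { value: buf.readBigUInt64LE(offset + 1), size: 9 };
}
```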
There were a couple of tests that were commented out due to working with the retargetAlgorithm data structure in testVectors.json. These need to be fixed.
CheckBitcoinSigs.wpkhSpendSighash calculates the sighash of a Bitcoin transaction. Among its parameters, it accepts bytes memory _outpoint, which is a 36-byte UTXO id consisting of a 32-byte transaction hash and a 4-byte output index.
The function in question should not accept an _outpoint that is not 36 bytes, but no length check is made:
bitcoin-spv/solidity/contracts/CheckBitcoinSigs.sol
Lines 130 to 159 in f5e4da0
Check that _outpoint.length is 36.
ValidateSPV.validateHeaderChain takes as input a sequence of Bitcoin headers and calculates the total accumulated difficulty across the entire sequence. The input headers are checked to ensure they are relatively well-formed:
bitcoin-spv/solidity/contracts/ValidateSPV.sol
Lines 173 to 174 in f5e4da0
However, the function lacks a check for a nonzero length of _headers. Although the total difficulty returned would be zero, an explicit check would make this clearer.
If headers.length is zero, return ERR_BAD_LENGTH.
isLegacy tooling
jump and public (for delegatecall)
require statements
Currently, Vin/Vout parsers error when they contain more than 0xfc inputs or outputs. In order to support arbitrary transactions in systems like tBTC, we should update them to allow for large Vins/Vouts.
Prove using the object/struct.
Rules:
"0x" prepended
No integers above 2**53 are required for proofs, so it's safe to use native int/Number types on 64-bit platforms
Vec<u8>, []byte, Uint8Array, or language equivalent when deserialized
Languages:
Golang
JS
Solidity (covered by JS?)
Rust
Python
Minor semantic quibble. Affects all languages. Will affect validateHeaderChain as well.
With the new type refactor, the CLI needs to be refactored.
@ErinHales want to take this one?
Hi! It would be really helpful to have library functions that query one of the block explorers and parse the transaction data to generate SPV proofs in the format discussed in the SPV Proof CLI readme (copied below) without having to run a bcoin full node.
{
"version": "01000000",
"vin": "0168fe12339c9eb3ceaf21899b775a53dc10901f88f57ba7b360bbd285fe1b8731000000006a47304402206f33a57c88e473a26eca617c21aa9545101b18df83b127bc485bf0fea1a831b702204d56bc0e39fa14ff254d0677f46c3f7f26f6dea8e6a9c5ba90aace3c8476f1d401210396dd84815a4f121bf29b882c283ec1cd8b5bfa92773008a79cb44d981821a399ffffffff",
"vout": "0230182b00000000001976a91452ada19e1305964e80a1a1cbbafea97f6b632fae88aca7ec3703000000001976a914f9da3787e63ad9261517a6d4d9dabd2cf40fb80988ac",
"locktime": "00000000",
"tx_id": "fff0d18db9f52e6bff445c26cb8bb9658882c8045997a74be26003225713e762",
"tx_id_le": "62e71357220360e24ba7975904c8828865b98bcb265c44ff6b2ef5b98dd1f0ff",
"index": 6,
"confirming_header": {
"raw": "00e0002074859a363c5a885b262722d5c5c5cd912e2cd1a53d0c0c000000000000000000a0c7ea544dcc99af8af5415d4293856da7c07f9f1a3b145d2498b39bf5fa5e36fbd1d45dd1201617d64b8b4c",
"hash": "00000000000000000003e7124509796d4f15ef47fedc71c79cea60d1ff503410",
"hash_le": "103450ffd160ea9cc771dcfe47ef154f6d79094512e703000000000000000000",
"height": 604596,
"prevhash": "0000000000000000000c0c3da5d12c2e91cdc5c5d52227265b885a3c369a8574",
"prevhash_le": "74859a363c5a885b262722d5c5c5cd912e2cd1a53d0c0c000000000000000000",
"merkle_root": "365efaf59bb398245d143b1a9f7fc0a76d8593425d41f58aaf99cc4d54eac7a0",
"merkle_root_le": "a0c7ea544dcc99af8af5415d4293856da7c07f9f1a3b145d2498b39bf5fa5e36"
},
"intermediate_nodes": "c7ca43c16c6a587c3554d45a1c0d56e801ef2dc929fae3cc95bb604b3f566de1f6e9750121695b84989ba819f6cea180315f8a3ae71d644f1bb90379577dbc6d7167faee2d76d172fcec41469346d3d731a901fa86e71528a8569a414ef42ed5585623d28015eb91eb1dda03615196918e49e5f1ff0ef22974..."
}
The script is very expensive, and bad citizenship. It could be done better using connectrum (CoinKite's electrum client). Electrum has blockchain.transaction.get_merkle, blockchain.transaction.get, and blockchain.block.headers, which should be sufficient for our purposes.
Hi James,
I am running parseAndStoreTransaction with the following transactions but the test is failing with "Unable to determine output length"
'0x0200000000010111b6e0460bb810b05744f8d38262f95fbab02b168b070598a6f31fad438fced4000000001716001427c106013c0042da165c082b3870c31fb3ab4683feffffff0200ca9a3b0000000017a914d8b6fcc85a383261df05423ddf068a8987bf0287873067a3fa0100000017a914d5df0b9ca6c0e1ba60a9ff29359d2600d9c6659d870247304402203b85cb05b43cc68df72e2e54c6cb508aa324a5de0c53f1bbfe997cbd7509774d022041e1b1823bdaddcd6581d7cde6e6a4c4dbef483e42e59e04dbacbaf537c3e3e8012103fbbdb3b3fc3abbbd983b20a557445fb041d6f21cc5977d2121971cb1ce5298978c000000'
'0x010000000001015e8354758db0f3efb35d08fe8b0c8c6927b1fc695e35d0c2fd96adac8f5a89f30100000017160014f08f7ed288a0dd21064432d6f7e963ca67a08e25ffffffff02102700000000000017a914f8c2c2a7610973eafc52b2316b32977d864b7d6c87480d00000000000017a914fbe237abdb365add85052c723ee85df40455e6ba8702483045022100969d23594db6b717ab4b3af7676c1e68fc15211ce9432ba167c732f19539563f0220366492713281f2cea79f7ba8e627f278e8abc25c5e05bd899261920d37268c800121038e4877b8ca31770b2d9fee054f6a47996adc1176891dbad15974935c9de3147b00000000'
'0x01000000000101db6b1b20aa0fd7b23880be2ecbd4a98130974cf4748fb66092ac4d3ceb1a5477010000001716001479091972186c449eb1ded22b78e40d009bdf0089feffffff02b8b4eb0b000000001976a914a457b684d7f0d539a46a45bbc043f35b59d0d96388ac0008af2f000000001976a914fd270b1ee6abcaea97fea7ad0402e8bd8ad6d77c88ac02473044022047ac8e878352d3ebbde1c94ce3a10d057c24175747116f8288e5d794d12d482f0220217f36a485cae903c713331d877c1f64677e3622ad4010726870540656fe9dcb012103ad1d8e89212f0b92c74d23bb710c00662ad1470198ac48c43f7d6f93a2a2687392040000'
I was inspecting the code further and I think the issue might be because of the following function in BTCUtils
function findNumOutputs(bytes _b) public pure returns (uint256) {
return 7 + (41 * extractNumInputs(_b));
}
The offset in this function is set to 41 bytes. This seems to work for native segwit transactions that you had in the tests but for the legacy transactions (p2sh-p2wpkh to p2sh-p2wsh), it is failing. I think the offset for the legacy transactions is around 71 bytes but would love to hear your thoughts and know if I'm doing something wrong!
We have natspec, JSDoc, docstrings, etc. We need a static site build process that's coherent
Add a few more examples for the cli in the golang readme. Preferably find examples that aren't quite so lengthy.
The client still uses BE/LE. Needs to be updated to have only LE
BTCUtils.verifyHash256Merkle is used by ValidateSPV.prove to validate a transaction's existence in a Bitcoin block. The function accepts as input a _proof and an _index. The _proof consists of, in order: the transaction hash, a list of intermediate nodes, and the merkle root.
The proof is performed iteratively, and uses the _index
to determine whether the next proof element represents a "left branch" or a "right branch:"
bitcoin-spv/solidity/contracts/BTCUtils.sol
Lines 574 to 586 in f5e4da0
If _idx is even, the computed hash is placed before the next proof element. If _idx is odd, the computed hash is placed after the next proof element. After each iteration, _idx is halved (_idx /= 2).
Because verifyHash256Merkle makes no requirements on the size of _proof relative to _index, it is possible to pass in invalid values for _index that prove a transaction's existence in multiple locations in the tree.
By modifying existing tests, we showed that any transaction can be proven to exist at at least one alternate index. This alternate index is calculated as (2 ** treeHeight) + prevIndex
- though other alternate indices are possible. The modified test is below:
it('verifies a bitcoin merkle root', async () => {
for (let i = 0; i < verifyHash256Merkle.length; i += 1) {
const res = await instance.verifyHash256Merkle(
verifyHash256Merkle[i].input.proof,
verifyHash256Merkle[i].input.index
); // 0-indexed
assert.strictEqual(res, verifyHash256Merkle[i].output);
// Now, attempt to use the same proof to verify the same leaf at
// a different index in the tree:
let pLen = verifyHash256Merkle[i].input.proof.length;
let height = ((pLen - 2) / 64) - 2;
// Only attempt to verify roots that are meant to be verified
if (verifyHash256Merkle[i].output && height >= 1) {
let altIdx = (2 ** height) + verifyHash256Merkle[i].input.index;
const resNext = await instance.verifyHash256Merkle(
verifyHash256Merkle[i].input.proof,
altIdx
);
assert.strictEqual(resNext, verifyHash256Merkle[i].output);
console.log('Verified transaction twice!');
}
}
});
Use the length of _proof to determine the maximum allowed _index. _index should satisfy the following criterion: _index < 2 ** (_proof.length.div(32) - 2).
Note that the subtraction by 2 accounts for the transaction hash and merkle root, which are assumed to be encoded in the proof along with the intermediate nodes.
This should be fairly easy
We need to build some intuition about how large txns can be without breaking SPV proofs.
For some reason we have a dedicated LE thing for every other digest, but not prevhash, which is BE only. Let's be consistent
I don't like that we rely on string (de)serialization in bitcoin-spv. We should figure out alternative, elegant ways of doing this.
Add a line to prettifyOutput
that prints the address associated with the output payload.
$ basename $PWD
js
$ npm i
npm WARN @summa-tx/[email protected] No repository field.
npm WARN @summa-tx/[email protected] license should be a valid SPDX license expression
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: [email protected] (node_modules/fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for [email protected]: wanted {"os":"darwin","arch":"any"} (current: {"os":"linux","arch":"x64"})
added 746 packages from 883 contributors and audited 885340 packages in 5.969s
found 5 vulnerabilities (1 moderate, 4 high)
run `npm audit fix` to fix them, or `npm audit` for details
Any idea what it actually means? Feels like person who cried wolf.
Golang slicing is not safe by default. It allows you to read past the written portion of a slice.
For example
a := []byte{1,2,3,4,5,6,7,8,9,10,11}
b := a[:3]
fmt.Println(hex.EncodeToString(b[0:6]))
> 010203040506
We should replace all direct slicing with a bounds-checked slice utility function.
I'm on v1.1.0 of the library. Using an example from the test files yields the correct output, like so:
truffle(development)> res = await instance.extractNumOutputs.call("0x010000000001027bb2b8f32b9ebf13af2b0a2f9dc03797c7b77ccddcac75d1216389abfa7ab3750000000000ffffffffaa15ec17524f1f7bd47ab7caa4c6652cb95eec4c58902984f9b4bcfee444567d0000000000ffffffff024db6000000000000160014455c0ea778752831d6fc25f6f8cf55dc49d335f040420f0000000000220020aedad4518f56379ef6f1f52f2e0fed64608006b3ccaff2253d847ddc90c9192202483045022100d9fb1c15fe691c06dace09305bdd7e3cd19ada9c9392ca3a8c0a6f22a61c2ef002206efd72d89b6c1680d4135de14887a774ad0d6ad81dcd15833c3dc30b90a5ca86012102d0ec63b4c9f3d9e8083a0216c22d675f6f9a5b0bf1931f09a690e7e8bb24f63402483045022100fc7bf8811762a0c25c65deed711304ffd81413a347b656f45e38e3be40ecfcb8022077a020fda57e57062f99e2c0b714d251a879664bdb6dffcb04642182645470ea0121039b3e8cd31336f9ce7733885cf6d64433df129ce4c274b089825bf1419d047a4300000000")
undefined
truffle(development)> res.toNumber()
2
But then this mainnet tx with witness won't work:
https://btc.bitaps.com/bbb47d5dfb78fd011ddf683160f48ba81755c4156200ba96ce6d4d6a0fd21583
truffle(development)> res = await instance.extractNumOutputs.call("0x02000000000101711ae730738826a48a93b0e3f400b331b879df4f6b351b8bef09a4b2f2272f3802000000171600145289ea94a4f777ce1ee19715eb850499e35eab06ffffffff0627e801000000000017a9140e9171e0175391d525d79b2aa96307c12e1f52bd873af30f00000000001976a9143a2292404db99d930b815a314a98c7551d8658d588acca220e000000000017a914104be1b96b281a7b4c61cea8321d49233d85bf3b87406bed070000000017a91416d1a08fee9746dfab4f36ce06e705ef69329df38771cd4f00000000001976a914aeb71ecebc2b5cc304c1a7cc880446d5ca1dd4ad88ac3cf5d8000000000017a91446fe7cb8fb1f5211225bea60e4ea99182bc3f9e3870247304402204178f9430398d8566d960dfd29e3bd4a5aeaf3d5a10c857bc8b774e2c9ace7fd0220628e31b73a05b3abd9ed1427bf8b90f72c9dad9c764cb3b3f577406ee7610b4f0121026ebc799795120e9b5791c77173e5fa78e79edbcb7a66cf13fa1494b25f95fe4600000000")
undefined
truffle(development)> res.toNumber()
137
And this testnet tx without witness does not work:
https://blockstream.info/testnet/tx/e9f07570343a6a49b541d4449028718c3d245a1a5d12439d3030a37dbd5026d2
truffle(development)> res = await instance.extractNumOutputs.call("0x0100000001253afcfa0813546f4bf7eb6a31fdf1ea23e36a255e5dd41f499b37b73a960db8000000006a47304402203e532ba8a5cb275b48765bc51a53eb64645a85382b1e4269db00ad30d09c7ba402207e3d5053e6c79028019529f33d3fd655d3ff54a919c9c1de74c87ae0b3315992012102392b10a28cc0fbc52fcfb2e9af7441839ca3014fefe30f95b0e6b14d56ef1da2feffffff02ed43291f500400001976a91410da3170f451f152ada3e4de2b2e457cbcc9e90a88ac807c814a0000000017a914fd9b7ec7c672afc42b98f7468e6ee0ca3e6f913c871da10700")
undefined
truffle(development)> res.toNumber()
126
I would try on a newer version but this function has been dropped, unfortunately.