Chris Double [ARCHIVE] on Nostr:
Original date posted: 2011-09-03
Summary of this message: The Bitcoin code may be using the wrong size calculation for transaction priority due to casting shenanigans in CWallet::CreateTransaction.
Original message:
In CWallet::CreateTransaction there is a call to 'GetSerializeSize' on
line 979 (https://github.com/bitcoin/bitcoin/blob/master/src/wallet.cpp#L979).
It looks like:
---------8<----------
unsigned int nBytes = ::GetSerializeSize(*(CTransaction*)&wtxNew, SER_NETWORK);
if (nBytes >= MAX_BLOCK_SIZE_GEN/5)
    return false;
dPriority /= nBytes;
---------8<----------
'wtxNew' is a CWalletTx. So this gets the serialized size of the
transaction, including all the extra data held in the wallet for that
transaction, and uses that size to compute the priority. Is that
correct? Should only the size of the CTransaction portion of the
transaction be used instead?
It looks like that is what the casting shenanigans were meant to
address, but unless I'm mistaken the cast doesn't actually do
anything. We get a pointer to a CTransaction but then dereference it,
so the template function 'GetSerializeSize' would get the most derived
class, right?
So was the intent to use the CWalletTx size, making the cast
superfluous, or was it supposed to be only the CTransaction portion,
with the cast being an incorrect way of doing that? Or am I suffering
from late night programmer syndrome and reading it wrong?
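For illustration, here is a minimal standalone sketch of the pattern I'm
asking about. The names Tx, WalletTx and GetSerializeSizeLike are made up
stand-ins for this example, not the actual Bitcoin types or function:
---------8<----------
#include <iostream>

// Hypothetical stand-ins, NOT the real Bitcoin classes: WalletTx plays
// the role of CWalletTx, Tx the role of CTransaction.
struct Tx { int txData = 0; };
struct WalletTx : Tx { int extraWalletData = 0; };

// Stand-in for the ::GetSerializeSize template: deduces T from its
// argument and reports a size that depends only on T.
template <typename T>
unsigned int GetSerializeSizeLike(const T&) { return sizeof(T); }

int main() {
    WalletTx wtxNew;
    // Passing wtxNew directly: T deduces as WalletTx, the derived type.
    std::cout << GetSerializeSizeLike(wtxNew) << "\n";
    // The cast-and-dereference form from CreateTransaction: the
    // expression's static type is Tx, so T deduces as Tx (the base).
    std::cout << GetSerializeSizeLike(*(Tx*)&wtxNew) << "\n";
    return 0;
}
---------8<----------
The two calls show whether the template ends up sizing the base or the
derived type, which is exactly the distinction my question hinges on.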
Chris.
--
http://www.bluishcoder.co.nz