Luke-Jr [ARCHIVE] on Nostr:
📅 Original date posted:2011-08-24
🗒️ Summary of this message: Gavin Andresen suggests implementing or enabling opcodes so new bitcoin addresses can be smaller, and scheduling a block chain split; Luke-Jr replies with a list of other fixes such a split could bundle.
📝 Original message: On Wednesday, August 24, 2011 11:12:10 AM Gavin Andresen wrote:
> So, if we are going to have new releases that are incompatible with
> old clients why not do things right in the first place, implement or
> enable opcodes so the new bitcoin addresses can be small, and schedule
> a block chain split for N months from now.
If a block chain split is to occur, it makes sense to try to fix as many
problems as possible:
- Replace hard limits (like 1 MB maximum block size) with something that can
dynamically adapt with the times. Maybe based on difficulty so it can't be
gamed? (a rough sketch follows this list)
- Adjust difficulty every block, without limits, based on an N-block sliding
window. I think this would solve the issue of the hashrate dropping
overnight, but maybe also add a block time limit, or perhaps include the
"current block" in the difficulty calculation? (see the retarget sketch below)
- 21 million really isn't enough if Bitcoin ever takes off, even with
100,000,000 units per BTC. Replacing the "Satoshi" 64-bit integers with
"Satoshi" variable-size fractions (i.e., infinite numerator + denominator)
would create infinite possibilities of future division, allowing people not
only to use nBTC and pBTC, but also to take an exact 1/3 of any quantity.
Transaction size would go up based on the number of primes involved in an
amount, which would encourage discarding annoying primes via transaction
fees. (a toy fraction type is sketched below)
- Standardize everything on network (big) endian. (byte-order sketch below)
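
As a rough illustration of the block-size point, a limit could be derived from
difficulty instead of being hard-coded. This is only a sketch: the reference
difficulty, the linear scaling rule, and the function name are all assumptions,
not proposed values.

#include <algorithm>
#include <cstdint>

// Hypothetical rule: let the maximum block size grow in proportion to how far
// difficulty has risen above a reference value, never dropping below the
// historical 1 MB cap. Every constant here is a placeholder.
static const uint64_t BASE_MAX_BLOCK_SIZE = 1000000;   // 1 MB floor
static const double REFERENCE_DIFFICULTY  = 1000000.0; // placeholder baseline

uint64_t DynamicMaxBlockSize(double currentDifficulty)
{
    double scale = std::max(1.0, currentDifficulty / REFERENCE_DIFFICULTY);
    return static_cast<uint64_t>(BASE_MAX_BLOCK_SIZE * scale);
}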
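
For the per-block retarget, a minimal sliding-window calculation might look
like this; the window contents, the 10-minute target spacing, and the absence
of any clamp are assumptions for illustration only. The "current block" idea
above would simply mean appending the candidate block's own timestamp to the
window before calling it.

#include <cstdint>
#include <vector>

// Per-block retarget over a sliding window of recent block timestamps
// (oldest first). Returns the factor to multiply the current target by:
// a result above 1 makes blocks easier to find, below 1 makes them harder.
static const int64_t TARGET_SPACING = 600; // 10-minute target, as today

double RetargetFactor(const std::vector<int64_t>& timestamps)
{
    int64_t actual   = timestamps.back() - timestamps.front();
    int64_t expected = TARGET_SPACING * static_cast<int64_t>(timestamps.size() - 1);
    return static_cast<double>(actual) / static_cast<double>(expected);
}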
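
For the variable-size fraction idea, here is a toy amount type holding a
numerator and denominator. It uses int64_t and C++17's std::gcd only to keep
the sketch short; a real design would need arbitrary-precision integers and a
serialization rule. Keeping the fraction in lowest terms is what ties
transaction size to the primes actually present in an amount.

#include <cstdint>
#include <numeric> // std::gcd (C++17)

// Toy stand-in for rational amounts. int64_t is for brevity only.
struct FractionAmount {
    int64_t num;
    int64_t den;

    FractionAmount(int64_t n, int64_t d) : num(n), den(d) { Reduce(); }

    // Reduce to lowest terms so serialized size reflects only the primes
    // actually present in the value.
    void Reduce() {
        int64_t g = std::gcd(num, den);
        if (g > 1) { num /= g; den /= g; }
    }

    FractionAmount operator+(const FractionAmount& other) const {
        return FractionAmount(num * other.den + other.num * den,
                              den * other.den);
    }
};

// Example: exactly one third of a coin, which the fixed 10^-8 grid cannot
// express: FractionAmount oneThird(1, 3);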
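
And for the byte-order point, multi-byte fields would simply be written
most-significant-byte first everywhere; the helper name below is illustrative,
not an existing function.

#include <cstdint>
#include <vector>

// Serialize a 32-bit field in network (big-endian) order, independent of
// host endianness.
void WriteBE32(std::vector<unsigned char>& out, uint32_t v)
{
    out.push_back(static_cast<unsigned char>((v >> 24) & 0xff));
    out.push_back(static_cast<unsigned char>((v >> 16) & 0xff));
    out.push_back(static_cast<unsigned char>((v >> 8) & 0xff));
    out.push_back(static_cast<unsigned char>(v & 0xff));
}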
I'm sure others can think of other chain-splitting fixes that wouldn't be too
much work to implement.