Luke-Jr [ARCHIVE] on Nostr:
📅 Original date posted:2011-08-24
🗒️ Summary of this message: A discussion on Bitcoin's block size limit and difficulty adjustment, with suggestions for dynamic adaptation and increased precision, but concerns about potential issues.
📝 Original message:
On Wednesday, August 24, 2011 12:46:42 PM Gregory Maxwell wrote:
> On Wed, Aug 24, 2011 at 12:15 PM, Luke-Jr <luke at dashjr.org> wrote:
> > - Replace hard limits (like 1 MB maximum block size) with something that
> > can dynamically adapt with the times. Maybe based on difficulty so it
> > can't be gamed?
>
> Too early for that.
Dynamically adapting would, by design, never be too early or too late. Changing
away from a fixed 1 MB will fork the block chain, which should be a minimized event.
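To make the idea concrete, here is a toy Python sketch of a limit that scales with difficulty. The baseline constants and the sub-linear (square-root) scaling are illustrative assumptions only, not anything specified in the proposal:

    # Purely illustrative: scale the block size limit with difficulty relative
    # to an arbitrary baseline. The constants and the square-root dampening are
    # assumptions, not part of any actual proposal.
    BASELINE_DIFFICULTY = 1_000_000   # hypothetical reference difficulty
    BASELINE_LIMIT = 1_000_000        # 1 MB, the current hard limit, in bytes

    def max_block_size(current_difficulty: float) -> int:
        """Return a block size limit that grows sub-linearly with difficulty."""
        scale = max(1.0, (current_difficulty / BASELINE_DIFFICULTY) ** 0.5)
        return int(BASELINE_LIMIT * scale)

Tying the limit to difficulty ties it to something miners cannot cheaply fake, which is the "can't be gamed" intent above.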
> > - Adjust difficulty every block, without limits, based on a N-block
> > sliding window. I think this would solve the issue when the hashrate
> > drops overnight, but maybe also add a block time limit, or perhaps
> > include the "current block" in the difficulty calculation?
>
> The quantized scheme limits the amount of difficulty skew miners can
> create by lying about timestamps to about a half a percent. A rolling
> window with the same time constant would allow much more skew.
Depends on the implementation, I'd think.
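For comparison, a minimal sketch of a per-block retarget over a sliding window of recent timestamps; the window size and the clamp values are assumptions, and the clamp is exactly the kind of limit Gregory's skew objection argues is needed:

    TARGET_SPACING = 600   # seconds per block
    WINDOW = 144           # hypothetical window size (roughly one day of blocks)

    def next_difficulty(current_difficulty: float, timestamps: list) -> float:
        """Retarget every block from the time the last WINDOW blocks actually took."""
        actual = timestamps[-1] - timestamps[-WINDOW - 1]
        expected = TARGET_SPACING * WINDOW
        ratio = expected / max(actual, 1)
        # Without some clamp, lying about timestamps in a rolling window can
        # skew difficulty far more than the quantized scheme allows.
        ratio = min(max(ratio, 0.75), 1.25)
        return current_difficulty * ratio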
> > Replacing the "Satoshi" 64-bit integers with
> > "Satoshi" variable-size fractions (ie, infinite numerator + denominator)
>
> Increasing precision I would agree with but, sadly, causing people to
> need more than 64 bit would create a lot of bugs.
>
> infinite numerator + denominator is absolutely completely and totally
> batshit insane. For one, it has weird consequences that the same value
> can have redundant encodings.
So? You can already have redundant transactions simply by changing the order
of inputs/outputs. A good client would, of course, minimize transaction size by
reducing the fractions to lowest terms.
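As a concrete illustration of both points, using Python's fractions module: 2/1954 and 1/977 encode the same value, and dividing out the gcd gives the canonical form a client would keep.

    from fractions import Fraction

    # Two encodings of the same value; reducing to lowest terms normalizes them.
    assert Fraction(2, 1954) == Fraction(1, 977)
    print(Fraction(2, 1954))   # -> 1/977, the reduced (canonical) encoding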
> Most importantly, it suffers factor inflation: If you spend inputs
> 1/977, 1/983, 1/991, and 1/997, the smallest denominator you can use for the
> output is 948892238557.
I already tried to address this in my original mail. If I had those 4 coins, I
would use a denominator of 987 and discard the difference as fees.
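Working the example through in Python: the exact sum of those four inputs really does need the denominator Gregory gives, while rounding the output down to 4/987 leaves only a tiny remainder for the miner. The choice of 987 is from the reply above; the fee figure is simply what the arithmetic yields.

    from fractions import Fraction
    from math import lcm

    # The exact sum of 1/977 + 1/983 + 1/991 + 1/997 needs the product of the
    # four (coprime) denominators as its own denominator: 948,892,238,557.
    inputs = [Fraction(1, d) for d in (977, 983, 991, 997)]
    exact = sum(inputs)
    assert exact.denominator == lcm(977, 983, 991, 997) == 948_892_238_557

    # The reply's approach: output a nearby small-denominator value instead,
    # and let the (tiny) remainder go to the miner as a fee.
    output = Fraction(4, 987)
    fee = exact - output
    assert fee > 0            # the output never exceeds the sum of the inputs
    print(output, fee)        # 4/987, plus a fee of roughly 2.4e-7 of a coin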