Pieter Wuille [ARCHIVE] /
npub1tje…tl6r
2023-06-07 15:44:17
in reply to nevent1q…68ak

📅 Original date posted: 2015-07-30
📝 Original message:
On Jul 30, 2015 6:20 PM, "Gavin Andresen" <gavinandresen at gmail.com> wrote:
>
> On Thu, Jul 30, 2015 at 10:25 AM, Pieter Wuille via bitcoin-dev
> <bitcoin-dev at lists.linuxfoundation.org> wrote:
>>
>> Some things are not included yet, such as a testnet whose size runs
>> ahead of the main chain, and the inclusion of Gavin's more accurate sigop
>> checking after the hard fork.
>>
>> Comments?
>
>
> First, THANK YOU for making a concrete proposal!

You're welcome.

> Specific comments:
>
> So we'd get to 2MB blocks in the year 2021. I think that is much too
> conservative, and the most likely effect of being that conservative is
> that the main blockchain becomes a settlement network, affordable only
> for large-value transactions.

If there is demand for high-value settlements in Bitcoin, and this results
in a functioning economy where fees support the security of a transparent
network, I think that would be a much better outcome than what we have now.

> I don't think your proposal strikes the right balance between
> centralization of payments (a future where only people running payment
> hubs, big merchants, exchanges, and wallet providers settle on the
> blockchain) and centralization of mining.

Well, centralization of mining is already terrible. I see no reason why we
should encourage making it worse. On the other hand, sure, not every
transaction fits on the blockchain. That is already the case, and will
remain the case with 2 MB, 8 MB, or 100 MB blocks. Some use cases fit, and
others won't. We need to accept that, and none of the previous proposals
I've seen seems to.

Maybe the only ones that will fit are large settlements between layer-2
services, and maybe it will be something entirely different. But at least
we don't need to compromise the security of the main layer, or promise the
ecosystem unrealistic growth of space for on-chain transactions.

If Bitcoin needs to support a large scale to succeed, it has already failed.

> I'll comment on using median time generally in Jorge's thread, but why
> does monotonically increasing matter for max block size? I can't think of
> a reason why a max block size of X bytes in block N followed by a max
> size of X-something bytes in block N+1 would cause any problems.

I don't think it matters much, but it offers a slightly easier transition
for the mempool (something Jorge convinced me of), and validation can
benefit from a single variable that can be set once along a chain to
indicate a switchover happened, without needing to recalculate it every
time.
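
To illustrate the monotonicity point (a sketch only; the activation time,
base size, and growth schedule below are assumed for illustration, not the
numbers from my proposal): a limit computed from the median-time-past can
only ratchet upward along a chain, because each block's median-time-past is
at least its parent's, whereas a block's raw nTime can be lower than its
parent's.

    #include <algorithm>
    #include <cstdint>

    // Sketch only: all three constants are assumed values.
    static const int64_t ACTIVATION_TIME = 1483228800;          // 2017-01-01
    static const uint64_t BASE_MAX_SIZE = 1000000;              // 1 MB
    static const int64_t DOUBLING_PERIOD = 4 * 365 * 24 * 3600; // ~4 years

    uint64_t MaxBlockSizeAt(int64_t median_time_past)
    {
        if (median_time_past < ACTIVATION_TIME)
            return BASE_MAX_SIZE;
        // Clamp so the arithmetic stays in range; in this sketch the
        // limit simply plateaus after ten doublings.
        int64_t elapsed = std::min<int64_t>(
            median_time_past - ACTIVATION_TIME, 10 * DOUBLING_PERIOD);
        uint64_t size = BASE_MAX_SIZE << (elapsed / DOUBLING_PERIOD);
        // Interpolate linearly within the current doubling period.
        int64_t rem = elapsed % DOUBLING_PERIOD;
        return size + size * uint64_t(rem) / uint64_t(DOUBLING_PERIOD);
    }

Since each block's median-time-past is greater than or equal to its
parent's, this function never decreases along a valid chain, which is what
lets validation track the current limit as a single per-chain variable
instead of recomputing it for every block.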

I assume you're asking this because using raw nTime means you can tell
what limits a p2p message should obey by looking at just its first bytes. I
don't think this matters: your p2p protocol should deal with whatever
limits the system requires anyway. An attacker can take a valid message
from far back in the chain, change a few bytes, and spam you with it at
zero extra effort.
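
To make that concrete (again a sketch, reusing the hypothetical
MaxBlockSizeAt() above, with an assumed overhead constant): the network
layer can simply bound every incoming block message by the largest size the
consensus rules could currently allow, rather than trusting size hints
derived from the message's own timestamp bytes, which the attacker
controls.

    // Sketch only: the overhead constant is assumed.
    static const uint64_t P2P_OVERHEAD = 1000; // serialization headroom

    bool AcceptableBlockMessageSize(uint64_t payload_size, int64_t now)
    {
        // Bound by the largest limit any valid block could carry right
        // now, not by whatever timestamp the message itself claims.
        return payload_size <= MaxBlockSizeAt(now) + P2P_OVERHEAD;
    }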

--
Pieter