Pieter Wuille [ARCHIVE] on Nostr:
📅 Original date posted: 2015-08-11
📝 Original message: On Tue, Aug 11, 2015 at 9:37 PM, Michael Naber via bitcoin-dev <bitcoin-dev at lists.linuxfoundation.org> wrote:
> Hitting the limit in and of itself is not necessarily a bad thing. The
> question at hand is whether we should constrain that limit below what
> technology is capable of delivering. I'm arguing that not only we should
> not, but that we could not even if we wanted to, since competition will
> deliver capacity for global consensus whether it's in Bitcoin or in some
> other product / fork.
>
The question is not what the technology can deliver. The question is what price we're willing to pay for it. It is not a boolean "at this size things break, and below it they work". A small constant-factor increase is unlikely to break anything in the short term, but it will come with higher centralization pressure of various forms. There is discussion about whether these centralization pressures are significant, but describing the limit as an artificial constraint on what the technology can deliver is IMHO a misrepresentation. The limit is there to aim for a certain balance between utility and risk, and neither extreme is interesting, even if it would possibly still "work".

Consensus rules are what keep the system together. You can't simply switch to new rules on your own, because the rest of the system will end up ignoring you. These rules are there for a reason. You and I may agree about whether the 21M limit is necessary, and disagree about whether we need a block size limit, but we should be extremely careful with change. My position as a Bitcoin Core developer is that we should merge consensus changes only when they are uncontroversial. Even if you believe a more invasive change is worth it, others may disagree, and the risk from that disagreement is likely larger than the effect of a small block size increase by itself: the risk that suddenly every transaction can be spent twice (once on each side of the fork), the very thing the block chain was designed to prevent.

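To make that fork risk concrete, here is a minimal toy sketch in Python (purely illustrative, not Bitcoin code; the class, the rule values, and the coin and owner names are all hypothetical) of how nodes that disagree on a single consensus rule end up each accepting a different spend of the same coin:

# Toy model of a consensus split: two node populations share pre-fork
# history but enforce different block size rules afterwards.

class Chain:
    def __init__(self, name, max_block_size):
        self.name = name
        self.max_block_size = max_block_size   # the consensus rule that differs
        self.utxos = {"coin-1": "alice"}       # shared pre-fork coin state

    def apply_spend(self, coin, new_owner, block_size):
        # Each node only enforces its own rules; a block that is invalid
        # on one side of the fork can still be accepted on the other.
        if block_size > self.max_block_size or coin not in self.utxos:
            return False
        del self.utxos[coin]                   # old output is consumed
        self.utxos[coin + "/spent-to-" + new_owner] = new_owner
        return True

# Illustrative rule values only, not a statement about any actual proposal.
old_rules = Chain("old-rules", max_block_size=1_000_000)
new_rules = Chain("new-rules", max_block_size=8_000_000)

# A spend of coin-1 to bob arrives in an oversized block: rejected by
# old-rules nodes, accepted by new-rules nodes, so the chains diverge.
print(old_rules.apply_spend("coin-1", "bob", block_size=2_000_000))    # False
print(new_rules.apply_spend("coin-1", "bob", block_size=2_000_000))    # True

# On the old-rules side coin-1 is still unspent, so it can now be spent
# again, this time to carol.
print(old_rules.apply_spend("coin-1", "carol", block_size=500_000))    # True

# coin-1 now belongs to carol on one chain and bob on the other.
print(old_rules.utxos, new_rules.utxos)

Both resulting histories are internally valid under their own rules, which is exactly the double-spend situation a single shared rule set exists to prevent.
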
My personal opinion is that we should aim to do a block size increase for the right reasons. I don't think fear of rising fees or unreliability should be the reason: if fees are being paid, it means someone is willing to pay them, and if people keep making transactions despite that unreliability, there must be a use for them. That may mean that some use cases no longer fit, but that is already the case.

--
Pieter