Dr Adam Back [ARCHIVE] /
npub1npq…jyfn
2023-06-07 15:38:48
in reply to nevent1q…jydw


📅 Original date posted:2015-06-19
📝 Original message:A lot of people think a layer2 is needed: one
that achieves higher (algorithmic) scale in its use of layer1
block-space but preserves functionality and uplifts security from
layer1. An example would be lightning or similar.
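
To make "higher (algorithmic) scale in use of layer1 block-space"
concrete: a payment channel lets two parties exchange arbitrarily
many signed balance updates off-chain while touching the chain only
to open and to close. Here is a toy Python sketch of just the
accounting; it deliberately elides everything that makes a real
channel safe (signatures on every state, revocation of old states,
timelocked refunds), and all names and numbers are illustrative:

    # Toy payment-channel accounting: n off-chain updates consume the
    # block-space of just two on-chain transactions (open + close).
    class ToyChannel:
        def __init__(self, alice_sats: int, bob_sats: int):
            self.balances = {"alice": alice_sats, "bob": bob_sats}
            self.onchain_txs = 1  # the funding (open) transaction
            self.updates = 0

        def pay(self, frm: str, to: str, amount: int) -> None:
            """One off-chain update: re-split the channel balance."""
            assert self.balances[frm] >= amount
            self.balances[frm] -= amount
            self.balances[to] += amount
            self.updates += 1

        def close(self) -> dict:
            """Settle the latest balance split on-chain."""
            self.onchain_txs += 1
            return self.balances

    ch = ToyChannel(alice_sats=100_000, bob_sats=0)
    for _ in range(10_000):  # ten thousand payments...
        ch.pay("alice", "bob", 1)
    final = ch.close()
    print(ch.updates, "payments settled in", ch.onchain_txs,
          "txs:", final)

Ten thousand payments, two transactions' worth of block-space: that
is the algorithmic win layer2 is after.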

But there are many things that could be done. Pure offchain is a
weak form of layer2. It's running today and may be handling 90-99%
of all transactions right now (mostly inside exchanges, for
example). This layer can be incrementally hardened. It can also
have standardised APIs across custodian vendors, with opt-in
support for those APIs in wallets; that would provide a convenience
choice. GreenAddress also solves unconfirmed transactions for
low-to-mid assurance uses. It's probably not reasonable to expect
bitcoin to directly solve fast unconfirmed transactions.
Intermediate configurations in complexity, somewhere between
GreenAddress (2 of 2 + timelocked 1 sig) and lightning, probably
exist as well. The internet doesn't stop at layer1.
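
To make the "2 of 2 + timelocked 1 sig" construction concrete, here
is a minimal Python sketch of what such a redeem script could look
like, assembled by hand from raw opcodes. The opcode byte values
are the real Bitcoin ones, but the keys and timeout are dummies,
and the use of OP_CHECKLOCKTIMEVERIFY (BIP65, still only a proposal
in mid-2015) is an illustrative assumption; GreenAddress's deployed
mechanism at the time used pre-signed nLockTime transactions rather
than an in-script timelock.

    # "2-of-2 multisig, or 1 signature after a timeout" redeem script.
    OP_IF                  = 0x63
    OP_ELSE                = 0x67
    OP_ENDIF               = 0x68
    OP_2                   = 0x52
    OP_CHECKMULTISIG       = 0xae
    OP_CHECKSIG            = 0xac
    OP_DROP                = 0x75
    OP_CHECKLOCKTIMEVERIFY = 0xb1  # BIP65, not yet active in 2015

    def push(data: bytes) -> bytes:
        """Minimal push for items up to 75 bytes (keys fit easily)."""
        assert len(data) <= 75
        return bytes([len(data)]) + data

    def push_int(n: int) -> bytes:
        """Push a small positive integer, little-endian, minimally."""
        out = b""
        while n:
            out += bytes([n & 0xff])
            n >>= 8
        if out and out[-1] & 0x80:  # keep the sign bit clear
            out += b"\x00"
        return push(out)

    def recovery_redeem_script(user_key: bytes, service_key: bytes,
                               timeout_height: int) -> bytes:
        """IF: 2-of-2 user+service. ELSE: after timeout, user alone."""
        return (bytes([OP_IF, OP_2])
                + push(user_key) + push(service_key)
                + bytes([OP_2, OP_CHECKMULTISIG, OP_ELSE])
                + push_int(timeout_height)
                + bytes([OP_CHECKLOCKTIMEVERIFY, OP_DROP])
                + push(user_key) + bytes([OP_CHECKSIG, OP_ENDIF]))

    # Dummy 33-byte compressed keys and a dummy block height:
    script = recovery_redeem_script(b"\x02" + b"\x11" * 32,
                                    b"\x03" + b"\x22" * 32, 400000)
    print(script.hex())

In normal operation the service co-signs (the 2-of-2 branch, where
a co-signing service can enforce policy such as rate limits); if
the service disappears, the user waits out the timelock and spends
with one signature.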

(Which would then leave the people who are uninterested in changing
client software to handle layer2 as "layer1 will always be enough"
die-hards, refusing the future and facing the O(n^2) scaling wall
or centralisation death with perplexing optimism :) Ok, not so
constructive, but maybe a gentle reminder that it is not
constructive in the reverse direction either to throw around
often-false characterisations. We're here now to improve bitcoin,
so let's do that.

What I said here seemed like it may be subject to misinterpretation,
so to clarify:

On 19 June 2015 at 11:22, Dr Adam Back <adam at cypherspace.org> wrote:
> For example it could hypothetically allow 10MB blocks on
> one chain and 100kB blocks on the main chain. People say complexity,
> scary. Sure I am talking longer term, but we have to also make
> concrete forward progress to the future or we'll be stuck here talking
> about perilously large constant changes in 5 years time!

I should clarify: I meant there that I was assuming we do one
increase within the next 12 months, which gives a buffer of 5 years
of R&D to improve things and build layer2.

But if we do no R&D on layer2, insist that clients can never change
to become layer2-aware, say layer2 is too hard, etc., then the risk
is that we'd be back in this discussion in some years, kicking the
can afresh with an even more centralising size change.

Sure, we should make the transition and introduction to layer2, and
any intermediate crunch, smoother, but "20MB now or else" isn't
really helping. It did help get the conversation revived, but at
this point it's a hindrance. Seriously, a big hindrance. No
offence, but please find a way to gracefully stop and rejoin the
constructive process. You can disagree on factors and points and
still be collaborative; others disagree frequently and have done
productive work cordially for years under the BIP process.


About scaling again:

Here is what I said before in my TL;DR post about my thoughts on
how we would gain throughput short-term to leave room for layer2
development.

> I think almost everybody is on board with a combination plan:
>
> 1. work to improve decentralisation (specific technical work already
> underway, and education)
> 2. create a plan to increase block-size in a slow fashion to not cause
> system shocks (eg like Jeff is proposing or some better variant)
> 3. work on actual algorithmic scaling
>
> In this way we can have throughput needed for scalability and security
> work to continue.
>
> As I said, you cannot scale an O(n^2) broadcast network by
> changing constants; you need algorithmic improvements.
>
> People are working on them already. All of those 3 things are
> being actively worked on right now and, in the case of
> algorithmic scaling and improving decentralisation, have been
> worked on for months.
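
To put a number on "increase block-size in a slow fashion" from
point 2 of that plan: a compounding annual growth rate is one way
to do it. The sketch below uses a 17.7%/year rate (which doubles
capacity roughly every 4.3 years) and the then-current 1MB starting
point; both parameters are illustrative assumptions, not the
numbers from any specific proposal in the thread.

    # Hypothetical gradual block-size schedule: compound a fixed
    # annual growth rate from an illustrative starting size.
    START_MB    = 1.0    # 1MB limit at the time of writing
    GROWTH_RATE = 0.177  # 17.7%/year ~= doubling every ~4.3 years

    def limit_at_year(years: float) -> float:
        """Block-size limit in MB after compounded growth."""
        return START_MB * (1.0 + GROWTH_RATE) ** years

    for year in range(0, 21, 5):
        print(f"year {year:2d}: {limit_at_year(year):6.2f} MB")
    # year  0:   1.00 MB
    # year  5:   2.26 MB
    # year 10:   5.10 MB
    # year 15:  11.53 MB
    # year 20:  26.03 MB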


Btw, I wonder if Gavin or Mike would be willing to answer another
question I forgot from my TL;DR post, which was:

- Did you accept payment from companies to lobby for 20MB blocks?
If so, do you consider that something appropriate to publicly
disclose? Do you consider that user rights should come above or
below company interests in Bitcoin?


FWIW, on pondering that last question ("should user rights come
above or below company interests"), my view of the guiding
principle is starkly clear: user rights should be the primary thing
to optimise for. Businesses are providing a service to users; their
interests are secondary, in the sense that if they are enabled to
provide better service, that's good.

Bitcoin is a user p2p currency, with a social contract and a strong
user ethos. Importing and forcing company interests would likely be
the start of a slippery slope towards an end to Bitcoin. If we
allow business interests to be paramount, it seems likely that we
will end up back at the status quo: bitcoin payment processors
grow, conglomerate, and become paypal/bank-like or actual banks;
then their interests and exposures are the same as the banks', and
they'll want to import their business models into Bitcoin and erode
the user-ethos features that are actually what give Bitcoin its
meaning and value for the majority.

That won't be good for the companies either, but they may not see
that until they've killed it; many companies operate on a 1 or 2
year time-horizon. They may say: screw layer2, I have a runway and
I need micropayments out the wazoo, and I don't have the dev
resources for that. That's a conflict, and the resolution isn't to
override bitcoin's meaning, but rather that they should do it at
layer2 (eg changeTip does this: a simple trust-me layer2, which is
OK given the amounts). The world needs a neutral,
social-contract-enforcing layer1. Layer1 must be neutral and free
from policy and dispute resolution; otherwise dispute-resolution
costs are imported and you lose the viral open-innovation growth
vector the internet benefitted from. Jurisdiction- and
regulation-related things belong at the interfaces and at the
payment protocol layer, in my view. (If that's not obvious to some
lurkers, I elaborate on that argument amongst other things here:
https://www.youtube.com/watch?v=3dAdI3Gzodo )

Adam

PS: the O(n^2) misunderstanding of varying assumptions was explored
at length on reddit
http://www.reddit.com/r/Bitcoin/comments/3a5f1v/mike_hearn_on_those_who_want_all_scaling_to_be/csboslb
if people are interested in that topic. I do not think O(t*n) is a
useful metric, because it is predictive only of the obvious and
internal; the useful predictive thing is resources vs users (per
node/user or for the whole system).
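
For anyone who wants the arithmetic behind that: each full node
validates every transaction, so per-node cost grows with volume t,
whole-system cost with t*n for n nodes, and if t itself grows in
proportion to the user base (an assumption, purely for
illustration) the whole-system cost grows as O(n^2). A minimal
Python sketch:

    # Per-node cost O(t); whole-system cost O(t*n); O(n^2) under the
    # illustrative assumption that volume t is proportional to n.
    TX_PER_USER = 10  # hypothetical transactions per user per period

    def per_node_cost(n_users: int) -> int:
        """Each full node validates every transaction: cost ~ t."""
        return TX_PER_USER * n_users

    def whole_system_cost(n_users: int, n_nodes: int) -> int:
        """All nodes validate all transactions: cost ~ t * n."""
        return per_node_cost(n_users) * n_nodes

    # If every user runs a node, system cost is quadratic in n:
    for n in (1_000, 10_000, 100_000):
        print(n, per_node_cost(n), whole_system_cost(n, n))
    # 10x the users -> 10x per-node cost, 100x whole-system cost

Which metric is "useful" then turns on exactly that point: per-node
resources vs users is what decides whether people can keep running
nodes.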