Will Madden [ARCHIVE] on Nostr:
📅 Original date posted: 2015-08-20
📝 Original message:
>
> And if you see Bitcoin as a payment system where guaranteed time to confirmation is a feature, I fully agree. But I think that is an unrealistic dream. It only seems reliable because of lack of use. It costs 1.5 BTC per day to create enough transactions to fill the block chain at the minimum relay fee, and a small multiple of that at actual fee levels. Assuming that rate remains similar with an increased block size, that remains cheap.
Apologies, this is going to be long but please read it...
For starters, if the “minimum relay fee” is 0.0001 BTC, and the proven throughput from the recent stress tests was 2.3 tx/second, it’s 0.0001 x 2.3 x 60 x 60 x 24, or 19.872 BTC, to fill up the block chain for 24 hours, not 1.5 BTC.
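For anyone who wants to double-check that arithmetic, here it is as a small Python sketch; the 0.0001 BTC relay fee and 2.3 tx/second throughput are just the figures assumed above, not measured constants:

# Rough cost to keep blocks full for one day at the minimum relay fee.
# Assumptions from above: 0.0001 BTC per transaction, 2.3 tx/second sustained.
min_relay_fee_btc = 0.0001
throughput_tx_per_s = 2.3
seconds_per_day = 60 * 60 * 24

cost_per_day_btc = min_relay_fee_btc * throughput_tx_per_s * seconds_per_day
print("Cost to fill the chain for a day: %.3f BTC" % cost_per_day_btc)  # ~19.872 BTC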
The math error isn’t important, because the premise of what you are saying is based on a misconception. No one is advocating that we should price-fix transaction fees at some tiny amount to guarantee cheap transactions. Furthermore, it’s perfectly realistic to believe bitcoin can scale reliably without holding the block size at 1MB, given certain constraints.
A quick disclosure: I run a small bitcoin startup that benefits from higher bitcoin prices and more people buying bitcoin with their fiat currency. I also have an inadvisably high percentage of my remaining personal savings denominated in bitcoin.
Back to the point: limiting block size to impose fee pressure is a well-intentioned idea that is built on a misconception of how bitcoin's economics work. The price of bitcoin is based on the perceived value of units inside the protocol. Keeping transaction volume through the bitcoin protocol capped at around 2.3 transactions/second limits the number of new people who can use bitcoin to around 100,000 people performing a little under 2 transactions daily. This is only a tiny bit more use than where we are presently. It’s forced stagnation.
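To make that capacity figure concrete, here is the same reasoning as a sketch; the 2 transactions per person per day is simply the assumed usage pattern:

# Daily users supported by ~2.3 tx/second of on-chain capacity.
throughput_tx_per_s = 2.3
tx_per_day = throughput_tx_per_s * 60 * 60 * 24   # ~198,720 transactions per day
tx_per_user_per_day = 2                           # assumed usage pattern
max_daily_users = tx_per_day / tx_per_user_per_day
print("%.0f tx/day supports roughly %.0f users at 2 tx each" % (tx_per_day, max_daily_users))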
Please, read and understand: constraining the network effect and adoption of bitcoin lowers its overall value, and by extension reduces its price as denominated in other currencies. The only alternatives presently to on-chain transactions are centralized bank ledgers at exchanges or similar companies. Yes, capping the bitcoin max_block_size at a level that restricts use will drive transaction fees higher, but it will also reduce the underlying price of bitcoin as denominated in other currencies, because the outside observer will see bitcoin stagnating and failing to grow like a nascent but promising technology should. Higher fees combined with a lower bitcoin price negate the value of the higher fees, and the only real effect is to stymie adoption in the process, while putting more focus on layer protocols that are nowhere near ready for mainstream, stable use!
Removing the cap entirely is also a catastrophically poor idea, because some group of jerks out there will absolutely make sure that every block is 32 MB, making it a real PITA for a hobbyist to get interested in bitcoin. Yes, some miners limit block size in order to balance propagation times against transaction fee revenue, so there is already a mechanism in place to push transaction fees higher, by limiting size or by not including fee-less transactions, that could offset a spam-happy bad actor or group of actors. But we cannot leave that to chance; the damage is too high to allow the risk. Bitcoin is going to grow because each new curious, technically savvy kid who learns about it can download and participate as a full node. We’re not anywhere close to mainstream adoption or at a level of maturity where the protocol is fully baked, so we have an obligation to keep full nodes within the grasp of a starving college kid’s budget. That is the barometer here in my mind.
A 40% annual growth rate should be our best guess for keeping bitcoin in reach of hobbyists, and safer from more Napster-esque node centralization. It's simple, which also makes it less prone to failure and to being picked apart politically. It may be too fast, or it may be too slow, but it’s probably a good guess for 5 years, and if history holds true, it will work for a long time and gradually lower the cost of running a node as well. No one can predict the future, but this is the best we have. No one knows whether it will be radio propagation of blocks, quantum spin liquid based storage or data transmission, or some other breakthrough that drives down costs, but something always seems to appear that keeps the long-term trends intact. So why wouldn’t we use these trends?
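Purely as an illustration, here is what an 8MB cap compounding at 40% per year looks like over five years; the exact compounding schedule is my own reading of the proposal, not a settled spec:

# Projected block size cap at 40% annual growth, starting from 8 MB.
start_cap_mb = 8.0
annual_growth = 0.40
for year in range(6):
    cap_mb = start_cap_mb * (1.0 + annual_growth) ** year
    print("year %d: %.1f MB" % (year, cap_mb))
# Prints 8.0, 11.2, 15.7, 22.0, 30.7, 43.0 MB for years 0 through 5.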
8MB is about 40% annually from January 2009 to today. I can buy a 5TB external hard drive right now online for $130.00 in the US. The true block time is just over 9 minutes, so that’s 160 blocks a day x 8MB x 365.25 days a year, or around 467.52GB of new block data annually. This is 10.69 years of storage for $130.00, or a little over $12 a year, which is darn close to what the cost was back in late 2010 when I first learned about this stuff... I fail to see the “centralization” issue here, and when we contrast $12/year for hobbyists against the centralization risks of mining pools, we should all be ashamed to have wasted so much energy and time talking about this specific point. The math does not add up; it’s not a significant centralization risk when we put an 8MB cap on this with 40% annual average growth. The energy we’ve blown on this topic should have been put into refining privacy and solving mining pool centralization. There are so many more important problems.
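Those storage numbers come straight from this arithmetic; here it is spelled out, using only the assumptions just stated (160 blocks a day, 8MB blocks, a $130 5TB drive):

# Annual storage cost for archiving full 8 MB blocks, under the assumptions above.
block_size_mb = 8.0
blocks_per_day = 160.0          # ~9-minute observed block interval
days_per_year = 365.25
drive_capacity_gb = 5000.0      # 5 TB external drive
drive_cost_usd = 130.0

growth_gb_per_year = block_size_mb * blocks_per_day * days_per_year / 1000.0  # ~467.52 GB/year
years_per_drive = drive_capacity_gb / growth_gb_per_year                       # ~10.69 years
cost_per_year_usd = drive_cost_usd / years_per_drive                           # ~$12.16/year
print("%.2f GB/year, %.2f years per drive, $%.2f per year" % (growth_gb_per_year, years_per_drive, cost_per_year_usd))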
Let's talk about other ideas for a moment. First, while lightning is really cool and I believe it will be an exponential magnifier for bitcoin someday, that day is NOT today. Waiting for layers over bitcoin to solve a self-imposed limit of 1MB is just a terrible, horrible idea for the health of the protocol. Lightning is really well thought through, but there are still problems that need to be solved. Forced channel expiration and transaction malleability are the theoretical issues that must be solved. There WILL be issues that aren’t anticipated and known today when it goes out “into the wild”. Protocols of this complexity do not go from white paper to stability in less than one to two years. Remember, even the bitcoin reference client had a catastrophic bug in August 2010, a year and a half after its January 2009 launch. I read here that the "deepest thinkers" believe we should wait for overlays to solve this bottleneck; well, bluntly, that is far from practical or pragmatic, and is “ivory tower” thinking. Discussing approaches like this is worse than agreeing to do nothing, because it drains our attention away from other more pressing issues that have a time limit on our ability to solve them before the protocol crystallizes from the scale of the network effect (like privacy and mining centralization), on top of accomplishing little of immediate value other than academic debates.
I’m not winning any popularity contests today… but it was a bad idea to approach this as we did with XT. We should have put in a solution that addressed just the cap size issue and nothing more. Other changes, pork, and changing the nature of the community management around the XT client are just too much political baggage to work without fracturing the support of the community. And guess what happened? We have the major community forum moderators actively censoring posts and banning users, and things look to the outside observer as if the entire system is starting to fall in on itself. The truth is, ladies and gentlemen, our egos and economic interests are creating a tragedy of the commons that will hurt the lot of us far more than it will help the best off of us. Yeah, I get that no one wants to code in a hostile environment, and this community has definitely turned caustic and behaves like a mob of petulant children, but sometimes you have to suck it up and get things done.
So… what do we do? We should get our @#$@ together, stop the academic grandstanding and ego-driven debates, raise the cap to 8MB with an average growth of 40% a year, then get back to solving real problems and working on layer and side chain magnifiers. Allowing bitcoin to grow reasonably lets adoption spread and the price rise, which creates more demand, higher prices, and more fees. Again, because the fees and coinbase rewards are denominated in bitcoin, this increases the return to miners. This, combined with allowing for growth, will encourage the price to rise and increase stability for layers and side chains later on down the road when the technology is stable and mature.
For the love of whatever it is you care about, can we please just get this done? Do I have to go brush up on my C++ and start asking everyone obnoxious, amateur questions? Trust me, no one wants that. Let’s just get the cap raised to 8MB plus 40% average annualized growth, and then fight viciously about something more important, like privacy or mining centralization.
Thanks.
> On Aug 10, 2015, at 8:34 AM, Pieter Wuille via bitcoin-dev <bitcoin-dev at lists.linuxfoundation.org> wrote:
>
> On Mon, Aug 10, 2015 at 4:12 PM, Gavin Andresen <gavinandresen at gmail.com <mailto:gavinandresen at gmail.com>> wrote:
>
> Executive summary: when networks get over-saturated, they become unreliable. Unreliable is bad.
>
> Unreliable and expensive is extra bad, and that's where we're headed without an increase to the max block size.
>
> I think I see your point of view. You see demand for on-chain transactions as a single number that grows with adoption. Once the transaction creation rate grows close to the capacity, transactions will become unreliable, and you consider this a bad thing.
>
> And if you see Bitcoin as a payment system where guaranteed time to confirmation is a feature, I fully agree. But I think that is an unrealistic dream. It only seems reliable because of lack of use. It costs 1.5 BTC per day to create enough transactions to fill the block chain at the minimum relay fee, and a small multiple of that at actual fee levels. Assuming that rate remains similar with an increased block size, that remains cheap.
>
> If you want transactions to be cheap, it will also be cheap to make them unreliable.
>
> --
> Pieter
>
> _______________________________________________
> bitcoin-dev mailing list
> bitcoin-dev at lists.linuxfoundation.org
> https://lists.linuxfoundation.org/mailman/listinfo/bitcoin-dev