
Topic: [Aug 2022] Mempool empty! Use this opportunity to Consolidate your small inputs! - page 3. (Read 85356 times)

legendary
Activity: 1512
Merit: 7340
Farewell, Leo
Also, increasing the size of the block makes this attack worse, than it currently is: https://bitcointalksearch.org/topic/m.1491085
That's the attack I was looking for but couldn't find. Thanks, garlonicon! As I understand it, this attack requires dedicating a little less than 1 MB of block space to execute. Currently, it is very expensive, but if the block size limit is 16 MB, a mining pool attacker could always fill a quarter of their block with these transactions. This would force the rest of the network to spend more than 12 minutes verifying them, effectively giving the attacker 12 minutes to mine alone.

Is that perhaps one of the best arguments in favor of a small block size? I see that it can practically only be resolved by imposing even stricter restrictions in Script, but I'm not sure whether it could be trivially resolved otherwise.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
Miners are another obstacle I completely overlooked. By raising the block size to 16 MB, you need to convince them that this will be more beneficial for their pockets, which is very debatable. At the moment, a simple wave of Ordinals can raise their fee income by 100%.

You need to convince them that on-chain transaction volume will eventually increase by orders of magnitude compared to before, but none of us can responsibly make that promise. No one can be held accountable for the potential shortcomings.
In a way, it's a flaw of the way Bitcoin works: miners have a financial incentive to keep transaction fees high, even if that reduces Bitcoin's usability.
hero member
Activity: 811
Merit: 1962
Code:
The bandwidth might not be as prohibitive as you think.  A typical transaction
would be about 400 bytes (ECC is nicely compact).  Each transaction has to be
broadcast twice, so lets say 1KB per transaction.  Visa processed 37 billion
transactions in FY2008, or an average of 100 million transactions per day.
That many transactions would take 100GB of bandwidth, or the size of 12 DVD or
2 HD quality movies, or about $18 worth of bandwidth at current prices.
There is one problem with that approach: verification. Sending the whole chain is not a problem. But verifying still is. And what is the bottleneck of verification? For example CPU speed, which depends on frequency:

2011-09-13: Maximum Speed | AMD FX Processor Takes Guinness World Record
Quote
On August 31, an AMD FX processor achieved a Guinness World Record with a frequency of 8.429GHz, a stunning result for a modern, multi-core processor. The record was achieved with several days of preparation and an amazing and inspired run in front of world renowned technology press in Austin, Texas.

2022-12-21: First 9 GHz CPU (overclocked Intel 13900K)
Quote
It's over 9000. ElmorLabs KTH-USB: https://elmorlabs.com/product/elmorla... Validation: https://valid.x86.fr/t14i1f

Thank you to Asus and Intel for supporting the record attempt!

Intel Core i9-13900K
Asus ROG Maximus Z790 Apex
G.Skill Trident Z5 2x16GB
ElmorLabs KTH-USB Thermometer
ElmorLabs Volcano CPU container

See? Humans are still struggling to reach 8-9 GHz, and you need liquid nitrogen to maintain that value. And more than a decade ago, the situation was pretty much the same. So, CPU speed is not "doubled" every year. Instead, you just get more and more cores: a 64-core processor, for example, instead of a 2-core or 4-core one.

Which means that yes, you can download 100 GB, maybe even more. But is the whole system really trustless if you have no chance of verifying that data, and you have to trust that all of it is correct? Imagine that you can download the whole chain very quickly, but it is not verified. What then?

Also note that if something can be done in parallel, then yes, you can use a 64-core processor and execute 64 different things at the same time. However, many steps during validation are sequential. The whole chain is a sequence of blocks. The whole block is a sequence of transactions (and their order does matter, if one output is an input to another transaction in the same block). The hashing of legacy transactions is sequential (also in cases like bare multisig, which has O(n^2) complexity for no reason).

So yes, you can have a 64-core processor at 4 GHz each, but a single core at 256 GHz would allow much more scaling. And this is one of the reasons why we don't have bigger blocks: the progress in validation time is just not sufficient to increase them much further.
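To illustrate the sequential-hashing point above, here is a toy model (assumed sizes, not consensus code) of why legacy signature hashing grows quadratically: verifying each input re-hashes a serialization of the whole transaction.

```python
# Toy model (assumed sizes, not consensus code): legacy sighash is O(n^2)
# because verifying each input hashes a serialization of the whole transaction.

def legacy_sighash_bytes(num_inputs: int, bytes_per_input: int = 150) -> int:
    """Approximate total bytes hashed to verify all inputs of one legacy tx."""
    tx_size = num_inputs * bytes_per_input   # transaction grows with inputs
    return num_inputs * tx_size              # each input hashes the whole tx

# 10x the inputs -> 100x the hashing work:
print(legacy_sighash_bytes(100))    # 1,500,000 bytes hashed
print(legacy_sighash_bytes(1_000))  # 150,000,000 bytes hashed
```

Segwit's BIP 143 sighash made this linear for witness inputs by reusing precomputed hash midstates, which is part of why validation cost, not bandwidth, drives the block size debate.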
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
Miners are another obstacle I completely overlooked. By raising the block size to 16 MB, you need to convince them that this will be more beneficial for their pockets, which is very debatable. At the moment, a simple wave of Ordinals can raise their fee income by 100%.

You need to convince them that on-chain transaction volume will eventually increase by orders of magnitude compared to before, but none of us can responsibly make that promise. No one can be held accountable for the potential shortcomings.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
SHA-256 miners are in charge of the BTC network
Miners are in charge of creating new blocks, every user is in charge of which consensus rules they accept. Of course, without miners that means there are no new blocks.
It's kinda sad the "one computer, one vote" thing can't work.
sr. member
Activity: 1666
Merit: 308
For the sake of the discussion, let's assume that the 16 MB limit is a harmless one. How do we enforce it in a softfork way? Segwit was enforced in a clever way, by separating the witness data from the transaction data. AFAIK, it's impossible to achieve in a softfork, unless you've figured out another way to restructure the transaction data.
I don't think a softfork can do this, but then again, when the limit was lowered, it must have been a hardfork too. Except back then there wasn't much controversy about it.
Are you sure about that?

Because even serious bugs early on were fixed with a soft fork.

Quote
"Why shouldn't this be done in a hardfork way?". It's already history, called "Bitcoin Cash". There is no point in redoing the same thing.
That was 7 years ago, surrounded by loads of controversy, and promoted by some people with their own agenda. I don't think it's right to use that failure to keep blocks the same size for eternity. I don't think a proper change, coming from the Bitcoin Core devs, that actually improves Bitcoin's future will be rejected.
But this isn't ETH where Vitalik's team controls the network... SHA-256 miners are in charge of the BTC network, so good luck trying to convince them.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
I don't think a softfork can do this, but then again, when the limit was lowered, it must have been a hardfork too. Except back then there wasn't much controversy about it.
That was a softfork. When you invalidate a currently valid rule, it's a softfork, and in this case, you invalidate blocks with size > 1 MB. However, a few years after this, a hardfork emerged.

Any decision taken by Satoshi was unquestionably followed, because... it was Satoshi. It's good that we don't have that source of influence anymore.

That was 7 years ago, surrounded by loads of controversy, and promoted by some people with their own agenda.
It still is surrounded by loads of controversy. First things first, what's the ideal block size increase? We're agreeing on the 16 MB limit, but stompix wants it to double on every halving. Another one might think 16 MB is still very small. I can already predict that the hardfork will end up just like Bitcoin Cash; uncertainty on the ideal adjustment will encourage people to stick with the tested, conservative 4 MB.

I don't think a proper change, coming from the Bitcoin Core devs, that actually improves Bitcoin's future will be rejected.
I want to remind you, at this point, that before Bitcoin Cash, there were Bitcoin developers who supported big blocks. The fact that they split a long time ago highlights how unlikely it is for the current small block developers to suddenly change their minds.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
you still need one on-chain Bitcoin TX in order to send your Bitcoin to an exchange/other service, to obtain the "wrapped" coin.
Isn't it more likely they use dollars or other coins for that? I don't expect real Bitcoin owners to exchange it for a made-up centralized token, but especially Binance tries really hard to make people withdraw their own made-up tokens instead of Bitcoin.
legendary
Activity: 2870
Merit: 7490
Crypto Swap Exchange
In addition, other scaling options (e.g. using LN or a sidechain) need an on-chain TX to open the LN channel or "peg" the coin on the sidechain. So a very high TX fee would make those options not very attractive.
I've never used any sidechain, and it seems like "wrapped" centralized tokens are much more popular than actual sidechains. Replacing central banks by businesses is not what I hoped for.

Yeah, "wrapped" coins are definitely more popular. But either way, you still need one on-chain Bitcoin TX in order to send your Bitcoin to an exchange/other service, to obtain the "wrapped" coin.

For the sake of the discussion, let's assume that the 16 MB limit is a harmless one. How do we enforce it in a softfork way? Segwit was enforced in a clever way, by separating the witness data from the transaction data. AFAIK, it's impossible to achieve in a softfork, unless you've figured out another way to restructure the transaction data.

The most straightforward option would be increasing the witness discount from 4 to 16. But without making Ordinals or other TXs which use OP_FALSE OP_IF ... OP_ENDIF non-standard, it'll just make the cost of spamming the Bitcoin blockchain even cheaper.
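For reference, a minimal sketch of how a witness discount translates into virtual size. The discount-4 formula matches current segwit rules (BIP 141); the discount-16 variant is hypothetical, and the byte counts are assumed ballpark figures:

```python
# Sketch: weight = discount * base_size + witness_size, vsize = weight / discount.
# discount=4 matches current segwit rules (BIP 141); discount=16 is hypothetical.

def vsize(base_size: int, witness_size: int, discount: int = 4) -> float:
    weight = discount * base_size + witness_size
    return weight / discount

# Ordinary payment (~110 base bytes, ~110 witness bytes): barely changes.
print(vsize(110, 110, discount=4))      # 137.5 vbytes
print(vsize(110, 110, discount=16))     # 116.875 vbytes

# Witness-heavy inscription-style tx (~100 base, ~10,000 witness bytes):
print(vsize(100, 10_000, discount=4))   # 2600.0 vbytes
print(vsize(100, 10_000, discount=16))  # 725.0 vbytes -- spam gets ~3.6x cheaper
```

This is exactly the concern above: a bigger discount barely helps ordinary payments, but makes witness-stuffed transactions dramatically cheaper per vbyte.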
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
For the sake of the discussion, let's assume that the 16 MB limit is a harmless one. How do we enforce it in a softfork way? Segwit was enforced in a clever way, by separating the witness data from the transaction data. AFAIK, it's impossible to achieve in a softfork, unless you've figured out another way to restructure the transaction data.
I don't think a softfork can do this, but then again, when the limit was lowered, it must have been a hardfork too. Except back then there wasn't much controversy about it.

Quote
"Why shouldn't this be done in a hardfork way?". It's already history, called "Bitcoin Cash". There is no point in redoing the same thing.
That was 7 years ago, surrounded by loads of controversy, and promoted by some people with their own agenda. I don't think it's right to use that failure to keep blocks the same size for eternity. I don't think a proper change, coming from the Bitcoin Core devs, that actually improves Bitcoin's future will be rejected.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
When syncing Bitcoin Core (on not very recent hardware), I already see multiple blocks per second being verified. I recently did it on a 5-year-old Xeon, and the IBD took 11 hours. That's 15 MB of block data verified per second on average.
My bad. I must have counted the time it takes for a Raspberry Pi to do it, and I ignored batch verification and other techniques Bitcoin Core implements to increase efficiency. I just multiplied the average total transactions by the time it takes to verify one ECDSA signature. However, I do believe certain transaction types take more time to verify than typical ones, which could be used as an attack vector. I'll get back to it when I find the data to back this claim.
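A back-of-envelope sketch of why the naive one-signature-per-transaction estimate undershoots what Bitcoin Core actually achieves. All three numbers below are assumptions for illustration, not measurements:

```python
# Back-of-envelope with assumed numbers: libsecp256k1 verification is much
# faster than the old ~1 ms/signature figure, and script checks run in parallel.

sig_verify_s = 50e-6   # ~50 us per ECDSA verify with libsecp256k1 (assumed)
cores = 8              # parallel script-verification threads (assumed)
sigs_per_mb = 2_500    # ~400 bytes and ~1 signature per tx (assumed)

mb_per_s = cores / (sig_verify_s * sigs_per_mb)
print(f"{mb_per_s:.0f} MB/s")  # 64 MB/s -- signature checks alone would beat
                               # the observed 15 MB/s, so disk and UTXO lookups
                               # likely dominate IBD time
```

Under these assumptions, signature verification alone is not the bottleneck during IBD; the observed 15 MB/s suggests database and I/O work dominates.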

For the sake of the discussion, let's assume that the 16 MB limit is a harmless one. How do we enforce it in a softfork way? Segwit was enforced in a clever way, by separating the witness data from the transaction data. AFAIK, it's impossible to achieve in a softfork, unless you've figured out another way to restructure the transaction data.

"Why shouldn't this be done in a hardfork way?". It's already history, called "Bitcoin Cash". There is no point in redoing the same thing.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
In addition, other scaling options (e.g. using LN or a sidechain) need an on-chain TX to open the LN channel or "peg" the coin on the sidechain. So a very high TX fee would make those options not very attractive.
I've never used any sidechain, and it seems like "wrapped" centralized tokens are much more popular than actual sidechains. Replacing central banks by businesses is not what I hoped for.
legendary
Activity: 2870
Merit: 7490
Crypto Swap Exchange
Yes, I know, it may sound a bit harsh. On the other hand, we are getting used to using on-chain transactions only when it's meaningful (hence worth paying even $50+ for it). For the rest, for small amounts like signature campaigns or paying for a VPN, sorry, but LN, no matter how imperfect it is, is the solution we should really consider. At least until a proper solution is discovered and implemented.
Why is LN a solution? What if everyone moves to LN? Will miners still profit?

As a reminder, you need 2 on-chain Bitcoin TXs when you use LN, in order to open and close the LN channel. That's where miners earn some income.

I haven't argued that raising the block size to 10-16 MB will eliminate Bitcoin. I'm just saying that it doesn't solve the problem, it only alleviates it.
I think we can agree on this, but with one difference: I'd like to see the "breathing space" in more Bitcoin transactions, while you don't want it. I'd like to see more on-chain transactions until a more permanent scaling solution is in place.

In addition, other scaling options (e.g. using LN or a sidechain) need an on-chain TX to open the LN channel or "peg" the coin on the sidechain. So a very high TX fee would make those options not very attractive.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
I haven't argued that raising the block size to 10-16 MB will eliminate Bitcoin. I'm just saying that it doesn't solve the problem, it only alleviates it.
I think we can agree on this, but with one difference: I'd like to see the "breathing space" in more Bitcoin transactions, while you don't want it. I'd like to see more on-chain transactions until a more permanent scaling solution is in place.

Quote
With a block size of 1 MB, it takes around 2 seconds to verify the block.
When syncing Bitcoin Core (on not very recent hardware), I already see multiple blocks per second being verified. I recently did it on a 5-year-old Xeon, and the IBD took 11 hours. That's 15 MB of block data verified per second on average.
hero member
Activity: 882
Merit: 792
Watch Bitcoin Documentary - https://t.ly/v0Nim
What about the halving after that? Or the second after that? In less than 20 years from now, the block subsidy will be less than 0.1 bitcoin. Block space has to be valuable.
Does it matter whether each of 10 people pays 100 sat/vByte as a fee, or each of 100 people pays 10 sat/vByte? It doesn't matter, because the outcome is the same. Bitcoin can't grow with such a low block size; it will simply become unattractive and people will actively start to look for alternatives.
Simply, there is no way that miners will lose profit because the block size is increased; no, that's not true.
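The fee-revenue point above is easy to sanity-check; the 140 vbytes per transaction is an assumed average, and the check only holds if demand fills the space either way:

```python
# Per-block fee revenue is the same whether 10 users pay 100 sat/vByte or
# 100 users pay 10 sat/vByte, as long as total demand fills the block space.

def block_fees(num_txs: int, fee_rate_sat_vb: int, vbytes_per_tx: int = 140) -> int:
    """Total sats collected; 140 vbytes per tx is an assumed average."""
    return num_txs * fee_rate_sat_vb * vbytes_per_tx

print(block_fees(10, 100))   # 140,000 sats
print(block_fees(100, 10))   # 140,000 sats -- identical revenue
```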

Yes, I know, it may sound a bit harsh. On the other hand, we are getting used to using on-chain transactions only when it's meaningful (hence worth paying even $50+ for it). For the rest, for small amounts like signature campaigns or paying for a VPN, sorry, but LN, no matter how imperfect it is, is the solution we should really consider. At least until a proper solution is discovered and implemented.
Why is LN a solution? What if everyone moves to LN? Will miners still profit?

Bitcoin is not meant for buying coffee or conducting other low-value transactions, at least not on-chain. It represents the best monetary standard we can have, and using it for such purposes undervalues its true potential.
I can't agree with you. Bitcoin is meant for P2P transactions, and that includes everything, from coffee to a car and so on. The simple fact is that Bitcoin wasn't meant for massive usage, and its block size was 1 MB because that was enough for 2010 and for a few upcoming years. As time goes on, the number of Bitcoin users increases and the technology advances, so it's perfectly okay and, to my mind, even necessary to increase the block size.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
That's the simplest explanation I can think of, because there are even more elaborate ones (like perhaps he had bipolar disorder). There are lots of inexplicable contradictions in those emails.
The forum user was a big blocker, while the coder was a small blocker.  Tongue
sr. member
Activity: 1666
Merit: 308
What's "Bitcoin"? I remember blocks being temporarily limited from 32 to 1 MB by Satoshi in the early days, to reduce spam and blockchain growth.
Satoshi also envisioned the blockchain growing by 100 GB per day, because that'd equate to "just the size of 12 DVDs", completely ignoring the bottleneck in verification. He believed the blockchain should be accessible only by server farms that generate new coins; everybody else should opt in to SPV. It's clear to me today that Satoshi wasn't the best figure to see as the leading expert.


HmmMAA will scold you for doubting Satoshi! Grin

On a serious note, after reading all Satoshi's emails that leaked a while ago, I came to the conclusion that it was most likely a band of people with varying interests/agendas.

That's the simplest explanation I can think of, because there are even more elaborate ones (like perhaps he had bipolar disorder). There are lots of inexplicable contradictions in those emails.
legendary
Activity: 1512
Merit: 7340
Farewell, Leo
But let's go back a bit in time, cause you claimed people only care about BTC, and check historical data
And where exactly did I claim that people care only about Bitcoin?

And what do we have here, every time Bitcoin fees rise, Doge alone outperforms Bitcoin.
Alright, I checked the data, and it is true that Dogecoin sometimes outperforms every cryptocurrency for some reason. And at other times, its total transactions fall rapidly. I can't explain this, because it doesn't always happen when Bitcoin fees rise. For example, there weren't high fees in the summer of 2023, but there were 2M Dogecoin transactions happening every day.

Your point is still unclear to me. I haven't argued that raising the block size to 10-16 MB will eliminate Bitcoin. I'm just saying that it doesn't solve the problem, it only alleviates it. It will still be expensive to use Bitcoin for everyday payments, and centralized solutions will still be more appealing.

First who said anything about 700MB size
Satoshi.

Wonder how doge managed to get its full 1MB blocks every minute, they must have some sort of much wow chips running their nodes.
With a block size of 1 MB, it takes around 2 seconds to verify the block. With a 1-minute interval, I'd expect a lot of blocks to be stale, by the way.

But the whole thing about nodes is pure cringe material. So we need 50k nodes with 10TB drives and latest-model CPUs to keep the network decentralized, which is impossible, but in the meantime...
You have 6 million!!! 3 kW monster ASICs worth $2k-3k mining, and that somehow is decentralization?
Can you use an ASIC to verify an ECDSA signature? You can't, and AFAIK, there is no faster way to do it than with a modern CPU.

Oh no, users won't be able to host their nodes
You probably missed it, but I was referring solely to the network between large mining servers. Satoshi was talking about large mining farms having exclusive access to the blockchain; everyone else should be using SPV.
legendary
Activity: 2828
Merit: 6108
Blackjack.fun
    Let's look at the closest evidence we have to verify this claim: altcoins.

    - Bitcoin Cash is the Bitcoin fork I'd use if I was forbidden from using Bitcoin. Total transactions in the last 24 hours: 14,546. Block size limit: 32 MB, block interval: 10 minutes.
    - Total Litecoin transactions in the last 24 hours: 181,694. Block size limit: 1 MB, every 2.5 minutes. (in 10 minutes: 4 MB)
    - Total Dogecoin transactions in the last 24 hours: 62,270. Block size limit: 1 MB, once a minute. (in 10 minutes: 10 MB)

    Total Bitcoin transactions in the last 24 hours: 862,464. If high on-chain capacity is what the world wants, then we should expect them to have migrated, and do more on-chain transactions there. It's clearly not the case, though.

    When you cherry-pick data, be careful for that data not to come and bite right back.
    The 860k Bitcoin transactions were a fluke! A big one, caused by an insane number of blocks in the last 24 hours (it was 172 on the 25th); as you can see, we're down to 595k on normal block time!

    But let's go back a bit in time, cause you claimed people only care about BTC, and check historical data:


    And what do we have here, every time Bitcoin fees rise, Doge alone outperforms Bitcoin.
    Proven fact, not opinion, proven fact that people care about fees!

    One mining pool would gain enormous advantage. 100 GB everyday is 700 MB every 10 minutes. Verifying an ECDSA signature takes about 1 millisecond on modern CPU. With just 4 MB block size, there can be about 7000 transactions every 10 minutes. If each has just one ECDSA signature, it takes 7 seconds to verify the block. With 700 MB block size, it'd take 175x as much time; 1225 seconds, which is 20 minutes. The mining pool with the most hashrate would render the rest unprofitable and gain all the hashrate.

    Oh, the horror.
    First, who said anything about a 700MB size? It's more like at most 16MB, so dividing that by 30 you get half a minute, and let me tell you, from the start the math ain't right on that time either.
    Wonder how doge managed to get its full 1MB blocks every minute, they must have some sort of much wow chips running their nodes.

    But the whole thing about nodes is pure cringe material. So we need 50k nodes with 10TB drives and latest-model CPUs to keep the network decentralized, which is impossible, but in the meantime...
    You have 6 million!!! 3 kW monster ASICs worth $2k-3k mining, and that somehow is decentralization?

    Oh no, users won't be able to host their nodes, we're doomed; meanwhile, the latest miners from Whatsminer can't even be plugged in at your house if you're on a 20A breaker, not to mention it's $8k a piece, and to have 50% of it you need a million!
    But somebody please think of the nodesssss!

    Who's going to open a LN channel if it costs $50? And another $50 or more to close the channel?

    Nobody!

    There is a more problematic math at work here, every time Bitcoin grows the amount guarded grows, with Bitcoin at 1 million there is 19 trillion in wealth there
    The only "worry" is for the amount being transferred. That's the part that can be replaced in a 51% attack. There's no moment all 19 trillion is at risk at the same time.

    The moment the coins you bought two hours ago suddenly return to the seller, because somebody has rewritten the chain with empty blocks for the last 4 hours, I doubt the price will stay the same. But of course, we could go back in time to when checks were used and only consider a transaction secure after 6000, not 6, blocks. It's amazing how everything else around gets faster and cheaper, but Bitcoin, which was supposed to be revolutionary, is somehow stuck in the last decade.


    Oh, LE:
    https://www.theblock.co/data/on-chain-metrics/bitcoin/runes-transactions
    598K Transactions, Rune tx: 382k, non-runes: 214k
    Litecoins Transactions last 24h (Number of transactions in blockchain per day)   253,796

     Grin Grin Grin Grin
    legendary
    Activity: 1512
    Merit: 7340
    Farewell, Leo
    That doesn't sound so bad at all
    This would be extremely bad. I don't know where to start.

    • One mining pool would gain enormous advantage. 100 GB everyday is 700 MB every 10 minutes. Verifying an ECDSA signature takes about 1 millisecond on modern CPU. With just 4 MB block size, there can be about 7000 transactions every 10 minutes. If each has just one ECDSA signature, it takes 7 seconds to verify the block. With 700 MB block size, it'd take 175x as much time; 1225 seconds, which is 20 minutes. The mining pool with the most hashrate would render the rest unprofitable and gain all the hashrate.
    • Spam is taken to the next level. People can now embed entire movies on-chain, which can introduce legal problems, and with a few large mining farms owning all the nodes, it'd be much easier to shut down the network.
    • There is no fee market competition. All transactions paying 1 satoshi have high priority, until there are more than 800k transactions unconfirmed within 10 minutes. This renders the system unsustainable over the long term.
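The arithmetic in the first bullet checks out under its stated assumptions (1 ms per signature verification is the post's figure; real verification with libsecp256k1 is considerably faster):

```python
# Reproducing the estimate above: 1 ms/signature is the post's assumption.

ms_per_sig = 1.0
txs_in_4mb = 7_000                     # ~7000 single-signature txs per 4 MB
verify_4mb_s = txs_in_4mb * ms_per_sig / 1000
scale = 700 / 4                        # 700 MB block vs 4 MB block
verify_700mb_s = verify_4mb_s * scale

print(verify_4mb_s)     # 7.0 s
print(scale)            # 175.0
print(verify_700mb_s)   # 1225.0 s, i.e. about 20 minutes
```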

    Let's be realistic: there are 18,500 full nodes
    Being pedantic here, but there are more than 50,000. The 18,500 are listening nodes.

    If I have to choose between 1) a user keeping their coins on an exchange or using custodial LN and 2) a user keeping their Bitcoin in a SPV wallet that relies on several large server farms, the latter sounds more decentralized.
    I see what you did there, but here's the difference: In the former, you have the option to use Bitcoin with no trusted third parties. In the latter, you may have self-custody, but you're forced to trust third parties (or realistically, practically forced).

    I believe Bitcoin should be accessible with no trust required, and thus, I stand with the former.

    I think a lot of Bitcoin users expect Bitcoin to scale at some point.
    People expect Bitcoin to scale through second layers. I don't think there are a lot of people aware of Bitcoin's history, believing it'll solve the problem by tinkering with the block size.