
Topic: Blocks are full. - page 16. (Read 14989 times)

legendary
Activity: 1008
Merit: 1001
In Cryptography We Trust
January 22, 2016, 01:11:25 AM
#45
I guess that we need to see frequent backlogs in 30 - 60 MB range for things to start changing.
Exactly. That is the only way that people are going to realise that we need change.

Yes. Plus Bitcoin's price dropping below $50 and Ethereum's market capitalization overtaking Bitcoin's. We will have "consensus by fear".
sr. member
Activity: 242
Merit: 250
January 22, 2016, 01:01:10 AM
#44
I guess that we need to see frequent backlogs in 30 - 60 MB range for things to start changing.
Exactly. That is the only way that people are going to realise that we need change.
legendary
Activity: 1302
Merit: 1008
Core dev leaves me neg feedback #abuse #political
January 22, 2016, 12:39:05 AM
#43

Classic has no roadmap or anything. Even if we disregard the safety risk of 2 MB blocks right now, we would be having this same discussion again very soon (if growth continues to increase). There's a proposal that aims to fix the quadratic validation time; we should wait for that to be implemented.

Lauda, so in addition to the 'quadratic' risk (for which you admit there is a fix, but Core is not implementing it),
you're also giving us the 'nothing is better than something' argument.

Really seems like you're shilling hard for Core/Blockstream.  Not that I
think they are paying you or anything.  You just have a huge bias
and seem to always support their position and actions.  That's
my OPINION and my impression.  Just saying.

legendary
Activity: 2674
Merit: 1083
Legendary Escrow Service - Tip Jar in Profile
January 21, 2016, 06:51:38 PM
#42
You guys must be kidding me with this. Bitcoin works perfectly. I just sent a transaction through my Core wallet, paid the recommended fee, and didn't have any problems. Don't be cheap on the fees, wait until 0.12 comes, and wait for SegWit; we can do this without raising the block size now. Maybe in the future if needed, but right now we have SegWit coming, and we can hold out until the Lightning Network is operative. We must resist the pressures.

Guess you haven't seen all the threads and complaints from newbies and established bitcoiners whose fees once again were not sufficient. And fees will only rise from here.

1 MB blocks also make spamming easy, so that makes things worse.

I can't see when a big part of the community became so conservative and fearful that nothing is allowed to change anymore because of "what if".
legendary
Activity: 1260
Merit: 1116
January 21, 2016, 06:44:38 PM
#41
It's the Fullblocalypse!

Panic! Stop! Run away!!










(Ok now come back.)
legendary
Activity: 2674
Merit: 1083
Legendary Escrow Service - Tip Jar in Profile
January 21, 2016, 06:44:31 PM
#40
I hear talk like that of the big guys we are here to defeat: this is not the Fed.

If the economy goes wrong, then let's give it some QE, then after a while QE2, then QE3, and so forth.

If we start raising the block size it will be like that in a way: tomorrow 2MB, in one week 4MB, you understand the trick.

This is not the dollar, and it never will be.

If it's needed to move adoption forward, then go ahead. Bitcoin will never succeed while adoption is effectively hindered. Sure, there are potential problems; one has to deal with them, as has happened all along. And suddenly an arbitrary setting becomes a stumbling block.
legendary
Activity: 2674
Merit: 1083
Legendary Escrow Service - Tip Jar in Profile
January 21, 2016, 06:42:19 PM
#39
If we already know that the solution is a block increase, I don't understand why it should be an increase to only 2 MB. If the increase is to 2 MB, in two years or less the community will be facing the same problem again.
So, in this sense, the increase should be to at least 8 MB or even more.

I guess the current line of thinking is just to kick the can down the road. You do not implement something before it has been thoroughly tested. I agree with this thinking in the sense that developers need to scale Bitcoin without risking the integrity of the whole network.

For example - " In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors. Other lines of code would need to be changed to prevent these problems. " - quoted from https://bitcoin.org/en/bitcoin-core/capacity-increases-faq#roadmap
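To see why validation cost blows up, here's a minimal back-of-the-envelope model (my own sketch, not code from the FAQ): under the legacy signature-hashing scheme, each input re-hashes roughly the whole transaction, so total bytes hashed grow with size times inputs, roughly quadratic in transaction size.

```python
def sighash_bytes(tx_size_bytes: int, num_inputs: int) -> int:
    """Rough model of legacy (pre-SegWit) signature checking:
    every input hashes (almost) the entire transaction once."""
    return num_inputs * tx_size_bytes

# A transaction stuffed with inputs grows its own size, so doubling
# it roughly doubles BOTH factors: cost scales about O(n^2).
one_mb = sighash_bytes(1_000_000, 5_000)    # ~5 GB hashed
two_mb = sighash_bytes(2_000_000, 10_000)   # ~20 GB hashed
print(two_mb / one_mb)  # -> 4.0: double the size, four times the work
```

The constants are made up; the point is only the shape of the curve, which is what the proposal mentioned in this thread aims to fix.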

You cannot just dump bigger block sizes into the protocol and hope nothing goes wrong, because it is a billion-dollar network. Let's just approach this with caution.

Well, nobody feared what would happen when blocks went from 0.1 MB full to 0.2 MB. Nobody was screaming that it was dangerous. And even with 2 MB blocks, two months ago NOBODY claimed any danger would come from that.

From what I read, though, the threat might be only theoretical in nature, similar to a >50% attack. I would need to know whether it is possible to create such a block even when it doesn't validate, since then you would not need a miner. If you need a miner, then no miner who can create blocks would kill the chance for his own block reward. And if this risk could be created with a normal PC, then it would be possible to simply flood the network with such blocks even at 1 MB; sheer mass would bring the danger then.

And it seems there is already a fix for that which could be implemented in Bitcoin Classic to make it safe.
legendary
Activity: 2674
Merit: 1083
Legendary Escrow Service - Tip Jar in Profile
January 21, 2016, 06:38:07 PM
#38
Um, I think that was the old hard limit for blocks. Some miners have not updated that yet. It's no proof that all transactions were cleared at that point in time. It's entirely up to the miner how many transactions he wants to put into a block.
That's not the hard limit. The 'hard limit' is 1 MB; you're talking about the 750 kB soft limit. It is interesting because Antpool was one of those that mined a 749 kB block, yet the block they had previously mined was 930 kB.

I know it is the soft limit, but I remember a discussion where it was said that the default setting in nodes was set to 750 kB even though the blockchain already allowed 1 MB blocks. Not all nodes had changed that, and people asked them to adjust the value in order to help the network. But it might be that some miners still use that old setting.

I might remember wrongly though.
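For what it's worth, the 750 kB figure matches the old default of Bitcoin Core's `blockmaxsize` option, which is a miner policy setting, not a consensus rule; if I recall correctly, a miner could lift it in `bitcoin.conf` roughly like this:

```
# bitcoin.conf -- miner policy only, not a consensus rule
# raise the soft cap from the old 750000-byte default
# up to the 1 MB consensus hard limit
blockmaxsize=1000000
```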

It would be convenient to check the last blocks, find one of these blocks with only one transaction (the block reward), and believe that it shows the network is greatly underloaded.
If I'm correct that is because of SPV mining.

You are correct about that. It only means that it is entirely up to the miner how many transactions he puts into a block. You can't judge from the block size how many transactions could not be included.

But since you mention SPV mining: I believe the newest implementation is that, when a new block is found on the network, miners instantly start mining an empty block on top of it in order not to lose hashing power. Then they start to validate transactions, gradually put them into the block, and begin hashing on the block with the additional transactions.

That might be the reason why Antpool creates such blocks? Dunno.
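A toy sketch of that validationless ("SPV") mining behaviour as described above (purely illustrative Python; the field names and structure are my invention, not any pool's actual code):

```python
def build_templates(prev_hash, mempool):
    """Yield the two block templates described above: first an
    empty one to hash on immediately, then a filled one."""
    # 1) A new block header arrived: don't wait for full validation,
    #    mine a coinbase-only template so no hashing power idles.
    yield {"prev": prev_hash, "txs": []}

    # 2) Once the parent block is validated, rebuild the template
    #    with mempool transactions, highest fee rate first.
    valid = [tx for tx in mempool if tx["valid"]]
    yield {"prev": prev_hash,
           "txs": sorted(valid, key=lambda tx: tx["fee_rate"], reverse=True)}

mempool = [{"valid": True, "fee_rate": 40}, {"valid": False, "fee_rate": 99}]
empty, full = build_templates("00ab...", mempool)
print(len(empty["txs"]), len(full["txs"]))  # -> 0 1
```

This would also explain the occasional one-transaction blocks mentioned earlier in the thread: they are simply the first template, solved before the refill happened.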
legendary
Activity: 1358
Merit: 1014
January 21, 2016, 09:57:23 AM
#37
You guys must be kidding me with this. Bitcoin works perfectly. I just sent a transaction through my Core wallet, paid the recommended fee, and didn't have any problems. Don't be cheap on the fees, wait until 0.12 comes, and wait for SegWit; we can do this without raising the block size now. Maybe in the future if needed, but right now we have SegWit coming, and we can hold out until the Lightning Network is operative. We must resist the pressures.
legendary
Activity: 994
Merit: 1035
January 21, 2016, 09:53:27 AM
#36

Classic has no roadmap or anything.

I believe this is deliberate, as there isn't an actual capacity difference between the proposals; Classic represents another attempt at a change in governance, with Toomin and Garzik in control (Gavin indicates he is only temporarily supporting it and doesn't want to lead).

As stated here -
http://pastebin.com/B8YQr5TQ

The governance model hasn't been fleshed out, but it will involve consider-it, which the Toomin brothers control. There is a clear indication of multiple hard forks in the future to "clean up" all of Core's code and keep increasing the block size.
legendary
Activity: 2674
Merit: 3000
Terminated.
January 21, 2016, 09:48:04 AM
#35
If we start raising the block size it will be like that in a way: tomorrow 2MB, in one week 4MB, you understand the trick.
Simple and straight to the point. The end result would simply be that users on slow networks will be unable to catch up.
In the meantime Bitcoin would most likely have to fork back quickly somewhere between 2 MB and 4 MB, or else the network would be left in an irreparable state.

I guess the current line of thinking is just to kick the can down the road. You do not implement something before it has been thoroughly tested. I agree with this thinking in the sense that developers need to scale Bitcoin without risking the integrity of the whole network.

You cannot just dump bigger block sizes into the protocol and hope nothing goes wrong, because it is a billion-dollar network. Let's just approach this with caution.
Classic has no roadmap or anything. Even if we disregard the safety risk of 2 MB blocks right now, we would be having this same discussion again very soon (if growth continues to increase). There's a proposal that aims to fix the quadratic validation time; we should wait for that to be implemented.
sr. member
Activity: 689
Merit: 269
January 21, 2016, 04:47:38 AM
#34
If we start raising the block size it will be like that in a way: tomorrow 2MB, in one week 4MB, you understand the trick.


Simple and straight to the point. The end result would simply be that users on slow networks will be unable to catch up.
legendary
Activity: 2310
Merit: 1422
January 21, 2016, 03:24:26 AM
#33
I hear talk like that of the big guys we are here to defeat: this is not the Fed.

If the economy goes wrong, then let's give it some QE, then after a while QE2, then QE3, and so forth.

If we start raising the block size it will be like that in a way: tomorrow 2MB, in one week 4MB, you understand the trick.

This is not the dollar, and it never will be.
legendary
Activity: 3542
Merit: 1965
Leading Crypto Sports Betting & Casino Platform
January 21, 2016, 01:05:21 AM
#32
If we already know that the solution is a block increase, I don't understand why it should be an increase to only 2 MB. If the increase is to 2 MB, in two years or less the community will be facing the same problem again.
So, in this sense, the increase should be to at least 8 MB or even more.

I guess the current line of thinking is just to kick the can down the road. You do not implement something before it has been thoroughly tested. I agree with this thinking in the sense that developers need to scale Bitcoin without risking the integrity of the whole network.

For example - " In 2MB blocks, a 2MB transaction can be constructed that may take over 10 minutes to validate which opens up dangerous denial-of-service attack vectors. Other lines of code would need to be changed to prevent these problems. " - quoted from https://bitcoin.org/en/bitcoin-core/capacity-increases-faq#roadmap

You cannot just dump bigger block sizes into the protocol and hope nothing goes wrong, because it is a billion-dollar network. Let's just approach this with caution.
sr. member
Activity: 574
Merit: 251
January 20, 2016, 04:02:32 PM
#31
Fair enough. I'm not taking sides, just so you know; I'm just asking because I'm curious. Do you have any article or thread I can read to help me grasp why a block size increase would be a bad thing? Right now I'm not for or against, but it seems to be a subject that stirs up a lot of feelings.
Well, I'd have to look for that information myself right now. The first thing that comes to mind is this thread: "Any examples of the 10 minute script". The best thing you could do is ask the developers yourself on IRC if you really want to know.

Ok, thanks for the link! I will check it out and ask around / search the web afterwards!
legendary
Activity: 2674
Merit: 3000
Terminated.
January 20, 2016, 03:28:01 PM
#30
Um, I think that was the old hard limit for blocks. Some miners have not updated that yet. It's no proof that all transactions were cleared at that point in time. It's entirely up to the miner how many transactions he wants to put into a block.
That's not the hard limit. The 'hard limit' is 1 MB; you're talking about the 750 kB soft limit. It is interesting because Antpool was one of those that mined a 749 kB block, yet the block they had previously mined was 930 kB.
It would be convenient to check the last blocks, find one of these blocks with only one transaction (the block reward), and believe that it shows the network is greatly underloaded.
If I'm correct that is because of SPV mining.
legendary
Activity: 2674
Merit: 1083
Legendary Escrow Service - Tip Jar in Profile
January 20, 2016, 03:05:32 PM
#29
The correct statement is 'some blocks are full'. You can't just jump to conclusions based on data recovered in the last few hours. Out of the last few blocks I see a few at around 750kb.

Um, I think that was the old hard limit for blocks. Some miners have not updated that yet. It's no proof that all transactions were cleared at that point in time. It's entirely up to the miner how many transactions he wants to put into a block. It would be convenient to check the last blocks, find one of these blocks with only one transaction (the block reward), and believe that it shows the network is greatly underloaded.
legendary
Activity: 2674
Merit: 1083
Legendary Escrow Service - Tip Jar in Profile
January 20, 2016, 02:46:05 PM
#28
It is frustrating how much energy is being wasted. Just scale the block size already...

It needs consensus; the devs don't want to release something that will not gather enough consensus and would basically be useless.

They are trying to see what the majority wants before deciding their move.

Then probably the only way is something like Bitcoin Classic, since so far no other way to find consensus has been suggested.
legendary
Activity: 2674
Merit: 1083
Legendary Escrow Service - Tip Jar in Profile
January 20, 2016, 02:44:20 PM
#27
We are staying below 1400 transactions per block, and with the stress tests we experienced 1483 and 1711 respectively, so we are getting close to an average of 1400 transactions per block. The thing that worries me is the decline in the estimated transaction volumes. The only people smiling seem to be the miners, because miners' revenue is averaging above 1,750,000 USD. Makes you wonder why we are seeing such high revenue when transaction volumes are declining.
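For scale, the ~1400-transactions-per-block figure implies, with rough arithmetic and assuming one block every ten minutes:

```python
# Back-of-the-envelope throughput at ~1400 tx per 1 MB block.
txs_per_block = 1400
blocks_per_day = 24 * 6                        # ~one block every 10 minutes
txs_per_day = txs_per_block * blocks_per_day
avg_tx_size = 1_000_000 // txs_per_block       # bytes, if blocks were full

print(txs_per_day)                     # -> 201600 transactions per day
print(round(txs_per_day / 86_400, 1))  # -> 2.3 transactions per second
print(avg_tx_size)                     # -> 714 bytes average per transaction
```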

Forced higher fees. Many pay higher fees in order to ensure their transactions go through, and wallets help with that by suggesting fees that will get transactions confirmed.
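As a toy illustration of that fee suggestion (a hypothetical function, not any real wallet's algorithm): look at what recently confirmed transactions paid and suggest a percentile of that.

```python
def suggest_fee_rate(confirmed_rates, percentile=0.5):
    """Toy estimator: suggest the fee rate (satoshis/byte) around
    the given percentile of recently confirmed transactions."""
    rates = sorted(confirmed_rates)
    return rates[int(len(rates) * percentile)]

# Rates (sat/byte) paid by transactions in recent blocks:
recent = [10, 20, 30, 40, 80, 90, 100]
print(suggest_fee_rate(recent))  # -> 40 (the median)
```

When blocks are full, the rates that actually get confirmed creep upward, so the suggestion creeps upward with them, which is exactly the dynamic being described.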

That's the plan for Bitcoin as far as the 1 MB fans are concerned.
hero member
Activity: 756
Merit: 500
January 20, 2016, 11:52:19 AM
#26
This is great: it means that Bitcoin is hot! It also means that a solution can emerge from all of this frustration.