
Topic: 1mb is too big (Read 3320 times)

legendary
Activity: 3150
Merit: 1392
October 17, 2016, 03:09:12 PM
#79


lulz



Sort of funny, but if we interpret 1MB as one million bitcoins and think about it seriously, then it really is a very big amount of money. Even now, with BTC at about $600, one million coins would be $600,000,000, which is an unbelievably big sum. So the dog's not retarded, and is right that people need to stop at some point when they try to earn money.
legendary
Activity: 2674
Merit: 2965
Terminated.
October 17, 2016, 02:18:59 PM
#78
If their genuine intent is to block SegWit, then I'm happy to go on record as stating that's unreasonable behaviour.  
That's a rather rational statement that is very rare in r/btc. I've seen fanatics praise the 'potential blocking' of Segwit. I've also seen absurd claims that Segwit is an altcoin.

I'm hoping that the actual reason has more to do with coding compatibility, for example, trying to get SegWit and Xthin working in conjunction and it taking a bit of time to figure that out before BU and others can add Segwit to their code.
It's up to the BU team to keep up to date with everything, and not up to the other teams to slow down if they can't catch up. Xthin wasn't really anything special, and BU is already outdated (isn't it at 0.12.1 IIRC?).

How do you mean 1 MB is too big? Maybe a couple of years ago 1 MB was too big for your mobile phone, but I don't think that nowadays 1 MB is really much for something like that, you know.
You clearly have no idea what you're talking about so I'd advise you to stop posting nonsense.
legendary
Activity: 3430
Merit: 3074
October 17, 2016, 10:11:47 AM
#77
I'm hoping that the actual reason has more to do with coding compatibility, for example, trying to get SegWit and Xthin working in conjunction

That would be a little pointless really. Xthin turned out to be written incompetently, and we have a thin blocks relay protocol working now without Xthin. What next, fix Xthin so that it works with Compact Blocks?

Xthin devs (the Classic team, isn't it?) should quit while they're behind, IMO. They had their chance, they even got to market first, but it wasn't good enough.
sr. member
Activity: 244
Merit: 250
October 17, 2016, 09:50:08 AM
#76
"640 kB ought to be enough for anybody" - Bill Gates ... and look at us now. Bitcoin allows for scaling, but you shouldn't run into it with your eyes closed, hoping you're not going to kill yourself. This experiment needs cautious people with open minds, putting investors' interests first. I would like to see bigger blocks, but not at the expense of the whole experiment.

LOL, the Bitcoin block size can't be estimated. Maybe in 2050 a block will be 100 MB, who knows? By then there could be over 10M or even 100M people using Bitcoin, out of a population of 10B.

It can be estimated. The block size is too big if it doesn't allow users to run their own nodes on a decent computer. It's as simple as that. If a node can't be run on a single computer, only on specialized machines/farms, it's over. Right now 1 MB is a sweet spot that still allows it to run on single computers.
How do you mean 1 MB is too big? Maybe a couple of years ago 1 MB was too big for your mobile phone, but I don't think that nowadays 1 MB is really much for something like that, you know.
legendary
Activity: 3724
Merit: 3063
Leave no FUD unchallenged
October 17, 2016, 09:41:20 AM
#75
ViaBTC is a joke: "We want on chain scaling" -> Segwit release imminent (on chain scaling) -> "Let's block Segwit". A bit ironic, isn't it?

If their genuine intent is to block SegWit, then I'm happy to go on record as stating that's unreasonable behaviour.  We should at least give it a try and see what impact it has.  I'm hoping that the actual reason has more to do with coding compatibility, for example, trying to get SegWit and Xthin working in conjunction and it taking a bit of time to figure that out before BU and others can add Segwit to their code.
legendary
Activity: 2674
Merit: 2965
Terminated.
October 17, 2016, 01:26:55 AM
#74
-snip-
All this combined, I'm ready for bigger blocks :)
All of that is irrelevant to the question of whether that is (safely) feasible.

So are two mining pools and approximately 11% of the total hashrate (at the time of writing), it seems.  
ViaBTC is a joke: "We want on chain scaling" -> Segwit release imminent (on chain scaling) -> "Let's block Segwit". A bit ironic, isn't it?
legendary
Activity: 992
Merit: 1000
October 16, 2016, 12:36:06 AM
#73
Hash rate for BU will definitely increase as time goes on. 11% now; 20% will follow soon enough.
legendary
Activity: 3724
Merit: 3063
Leave no FUD unchallenged
October 11, 2016, 06:52:54 AM
#72
All this combined, I'm ready for bigger blocks :)

So are two mining pools and approximately 11% of the total hashrate (at the time of writing), it seems.  

//EDIT:  Roughly 11% of nodes, too.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
October 11, 2016, 05:44:35 AM
#71
It shouldn't stop if it isn't doing anything. Relying on HDD noise is a bad way of testing whether it's actually doing something. If the CPU load is low, then the bottleneck is the IOPS of the HDD.
I can now confirm the HDD was the limitation. I've upgraded from 4 to 12 GB of RAM, which made it possible to download a pruned blockchain (2.4 GB total with -prune=550) to a RAM drive.
The result: Bitcoin Core used all CPU it could get continuously for processing blocks. It downloaded at maximum speed once in a while, then stopped downloading for a while, all the time still processing the blocks.
In total it downloaded approximately 86 GB, and it took 1 full day on an i3 (while also doing normal tasks). A huge speed improvement.
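For reference, the pruned-on-a-RAM-drive setup described above roughly corresponds to a bitcoin.conf like this (the mount point and cache size are illustrative assumptions, not the poster's actual settings):

```ini
# Sketch of a setup like the one described above:
prune=550                      # keep only ~550 MB of block files (the minimum Core allows)
datadir=/mnt/ramdisk/bitcoin   # data directory on a RAM drive (example mount point)
dbcache=2048                   # give the UTXO cache more of the new RAM, in MB
```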

I also noticed  Bitcoin Core 0.13.0 is more responsive than the previous version. While synchronizing I can now click Send > Inputs and get the popup instantly. This used to take up to several minutes.

All this combined, I'm ready for bigger blocks :)
legendary
Activity: 2674
Merit: 2965
Terminated.
September 26, 2016, 12:20:25 PM
#70
And 98% of nodes right now can support 4MB blocks without a problem.      
I'd like to see the testing methodology used to draw this conclusion. IIRC there was some 'research' being done by some team a while ago, stating something like this (although I'm unsure about the specifics and can't find it). Please post the source if you are able to.

I recommend a 10 MB size; maybe that's enough already.
That number is just arbitrary and useless.

1mb is clearly not enough but 10mb would be a shock to most systems.
A tenfold increase in resource usage is definitely a "killer".
legendary
Activity: 1288
Merit: 1087
September 24, 2016, 11:15:34 AM
#69
uh, you don't think the people who spend all day every day up to their eyes in bitcoin know less than you?

1mb is clearly not enough but 10mb would be a shock to most systems.
full member
Activity: 224
Merit: 100
September 24, 2016, 10:56:08 AM
#68
I can't figure out what you are talking about, but I guess you mean the Bitcoin block size. As a programmer, 1 MB is not a big size to me. But since Bitcoin has so many blocks, they obviously sum up to a huge number.

Yes, I agree with you that 1 MB is not enough; we should always consider the huge number of blocks Bitcoin accumulates.
I recommend a 10 MB size; maybe that's enough already.
legendary
Activity: 1106
Merit: 1005
September 23, 2016, 10:47:42 AM
#67


I don't think it has anything to do with being a "rich guy's club." Being able to run a node shouldn't be a "rich guy's club" either. And though mining already is, do we want to exacerbate that? Maxblocksize is very much about preventing the externalities that limit node/miner participation into more and more centralized groups.

I think "artificial limit" implies a misconception. Bitcoin as a system was a matter of designing a system that optimally aligns market incentives to ensure a decentralized, secure network. We cannot have talk about an "artificial limit" without mention that this limit is specifically designed to ensure decentralization and security---the former by maximizing the ability to run nodes and participate in mining, and the latter by forcing users to pay some of the cost of confirmed transactions, rather than externalizing all costs onto nodes and miners (forcing some off the network). I take exception when people try to make this about "free markets vs. central planning."

Bitcoin was centrally planned from Day 1; the entire point is trying to design the optimal system, with regard for all of its parts. The point wasn't to disregard most of the system so we can guarantee cheap/free transactions for users.

blocksize doesn't affect miners as much as nodes. Miners who can afford to mine in this market can afford to have bigger blocks too, because blocksize is limited by nodes, not miners.

Just that miners will have to receive more than 4 MB of transactions before they can start mining. :P

Which is not a problem at all. Download speed is not a bottleneck for miners, and you need huge amounts of money to even be competitive in the mining industry in the first place, so they can all afford internet. You don't even need fast internet; but even if miners did, they could afford it. It will not increase centralization at all.


And 98% of nodes right now can support 4MB blocks without a problem.      


And the nodes that can't, really shouldn't be on the network. We don't want weak links in the chain.

You'll never reach 100%, 98% is more than acceptable.

Not upgrading hurts more people than upgrading. In the longer term, more people will run nodes (because there will be more adopters), so the weak nodes will be replaced.
legendary
Activity: 854
Merit: 1000
September 23, 2016, 05:07:19 AM
#66
"640 kB ought to be enough for anybody" - Bill Gates ... and look at us now. Bitcoin allows for scaling, but you shouldn't run into it with your eyes closed, hoping you're not going to kill yourself. This experiment needs cautious people with open minds, putting investors' interests first. I would like to see bigger blocks, but not at the expense of the whole experiment.

LOL, the Bitcoin block size can't be estimated. Maybe in 2050 a block will be 100 MB, who knows? By then there could be over 10M or even 100M people using Bitcoin, out of a population of 10B.
I don't think 100 MB will be considered a lot in 2050, with much better devices. Earlier, even 1 MB was huge due to slow internet. Technology is going to improve in the future, for sure.
legendary
Activity: 3290
Merit: 16489
Thick-Skinned Gang Leader and Golden Feather 2021
September 23, 2016, 04:46:55 AM
#65
Block size certainly affects miners greatly. An entire block needs to be relayed across all nodes in a timely manner for mining to be decentralized. The more data propagated, the more latency.
That's the same for all miners. Even if they spend a few seconds downloading the block, and all miners lose a few seconds from the 600 seconds average between blocks, the difficulty will correct for this. One way or another, there will be one block every 10 minutes.
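The correction described here can be sketched with the (simplified) retarget formula, ignoring Bitcoin's 4x adjustment clamp and using illustrative numbers:

```python
# Simplified sketch of Bitcoin's difficulty retarget: every 2016 blocks,
# difficulty is scaled by (target time / actual time), pulling the average
# interval back toward 600 seconds even if propagation adds a fixed delay.

TARGET = 600  # target seconds per block

def retarget(difficulty, actual_avg_interval):
    """New difficulty after one adjustment period (4x clamp omitted)."""
    return difficulty * TARGET / actual_avg_interval

# Suppose relaying a bigger block adds ~5 s of dead time to every interval:
d = retarget(1.0, 600 + 5)
# Difficulty drops slightly (~0.8%), so blocks are found faster and the
# overall interval returns toward 600 s -- the delay is absorbed equally
# by all miners, as the post argues.
print(d)  # slightly below 1.0
```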

I think Mike Hearn, who blames the Chinese “Great Firewall” censorship system for limiting the Chinese miners, could be right. They have the advantage of very cheap (and dirty) electricity, but they don't have decent internet.
And since the hash rate puts them in power, they can stop blocks from growing to protect their interests.

There could be any number of reasons why blocks aren't filling up (or why average block size spiked earlier in the year). There may have been ongoing DOS/spam attacks that have died down. Services (and people) have probably gotten smarter about batching payments and cutting out unnecessary spends (I have). One would think that rationally, miners would be picking up transactions with low (but non-zero) fees instead of mining non-full blocks... so it may be a matter of fee policy enforcement, as we do see a steady ~2000 unconfirmed transactions even after many 100-600kb blocks in a row.
Mike Hearn says this:
Quote
The reason the true limit seems to be 700 kilobytes instead of the theoretical 1000 is that sometimes miners produce blocks smaller than allowed and even empty blocks, despite that there are lots of transactions waiting to confirm
It could very well be the miners simply don't care about the transaction fees! They get 12.5 BTC, and if they spend a few seconds on processing waiting transactions, they risk losing the block to someone else. So they skip the transactions and claim the 12.5 BTC as fast as they can. If this is the case, the block reward does the opposite of what it should do: enable transactions! It may need several more halvings for this behaviour to change.
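As a back-of-the-envelope illustration of this incentive (every number here is an assumption for the sketch, not measured data):

```python
# Why a miner might publish a near-empty block: the expected orphan loss
# from the extra time spent assembling and propagating a full block can
# rival the fee revenue the block would earn. All figures are illustrative.

SUBSIDY_BTC = 12.5    # block reward in 2016
FEES_BTC = 0.25       # assumed total fees in a full block
EXTRA_SECONDS = 10    # assumed extra build/propagation time for a full block
AVG_INTERVAL = 600    # average seconds between blocks

# Rough probability a competitor finds a block during the extra delay:
orphan_risk = EXTRA_SECONDS / AVG_INTERVAL            # ~1.7%
expected_loss = orphan_risk * (SUBSIDY_BTC + FEES_BTC)  # ~0.21 BTC

# Including the transactions only pays if fees beat the expected loss:
print(FEES_BTC > expected_loss)  # marginal at these assumed numbers
```

With the subsidy this large relative to fees, the margin is thin, which is consistent with the post's point that several more halvings may be needed before fees dominate the decision.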
hero member
Activity: 697
Merit: 520
September 22, 2016, 05:49:46 PM
#64


I don't think it has anything to do with being a "rich guy's club." Being able to run a node shouldn't be a "rich guy's club" either. And though mining already is, do we want to exacerbate that? Maxblocksize is very much about preventing the externalities that limit node/miner participation into more and more centralized groups.

I think "artificial limit" implies a misconception. Bitcoin as a system was a matter of designing a system that optimally aligns market incentives to ensure a decentralized, secure network. We cannot have talk about an "artificial limit" without mention that this limit is specifically designed to ensure decentralization and security---the former by maximizing the ability to run nodes and participate in mining, and the latter by forcing users to pay some of the cost of confirmed transactions, rather than externalizing all costs onto nodes and miners (forcing some off the network). I take exception when people try to make this about "free markets vs. central planning."

Bitcoin was centrally planned from Day 1; the entire point is trying to design the optimal system, with regard for all of its parts. The point wasn't to disregard most of the system so we can guarantee cheap/free transactions for users.

blocksize doesn't affect miners as much as nodes. Miners who can afford to mine in this market can afford to have bigger blocks too, because blocksize is limited by nodes, not miners.

And 98% of nodes right now can support 4MB blocks without a problem.     

Block size certainly affects miners greatly. An entire block needs to be relayed across all nodes in a timely manner for mining to be decentralized. The more data propagated, the more latency.

Where do you get this 98% number? The IC3 paper suggested that 4MB blocks would kick 10% of nodes off the network, and they only tested propagation effects.
sr. member
Activity: 378
Merit: 250
September 22, 2016, 05:47:56 PM
#63


I don't think it has anything to do with being a "rich guy's club." Being able to run a node shouldn't be a "rich guy's club" either. And though mining already is, do we want to exacerbate that? Maxblocksize is very much about preventing the externalities that limit node/miner participation into more and more centralized groups.

I think "artificial limit" implies a misconception. Bitcoin as a system was a matter of designing a system that optimally aligns market incentives to ensure a decentralized, secure network. We cannot have talk about an "artificial limit" without mention that this limit is specifically designed to ensure decentralization and security---the former by maximizing the ability to run nodes and participate in mining, and the latter by forcing users to pay some of the cost of confirmed transactions, rather than externalizing all costs onto nodes and miners (forcing some off the network). I take exception when people try to make this about "free markets vs. central planning."

Bitcoin was centrally planned from Day 1; the entire point is trying to design the optimal system, with regard for all of its parts. The point wasn't to disregard most of the system so we can guarantee cheap/free transactions for users.

blocksize doesn't affect miners as much as nodes. Miners who can afford to mine in this market can afford to have bigger blocks too, because blocksize is limited by nodes, not miners.

Just that miners will have to receive more than 4 MB of transactions before they can start mining. :P


And 98% of nodes right now can support 4MB blocks without a problem.     



And the nodes that can't, really shouldn't be on the network. We don't want weak links in the chain.
legendary
Activity: 1106
Merit: 1005
September 22, 2016, 05:33:10 PM
#62


I don't think it has anything to do with being a "rich guy's club." Being able to run a node shouldn't be a "rich guy's club" either. And though mining already is, do we want to exacerbate that? Maxblocksize is very much about preventing the externalities that limit node/miner participation into more and more centralized groups.

I think "artificial limit" implies a misconception. Bitcoin as a system was a matter of designing a system that optimally aligns market incentives to ensure a decentralized, secure network. We cannot have talk about an "artificial limit" without mention that this limit is specifically designed to ensure decentralization and security---the former by maximizing the ability to run nodes and participate in mining, and the latter by forcing users to pay some of the cost of confirmed transactions, rather than externalizing all costs onto nodes and miners (forcing some off the network). I take exception when people try to make this about "free markets vs. central planning."

Bitcoin was centrally planned from Day 1; the entire point is trying to design the optimal system, with regard for all of its parts. The point wasn't to disregard most of the system so we can guarantee cheap/free transactions for users.

blocksize doesn't affect miners as much as nodes. Miners who can afford to mine in this market can afford to have bigger blocks too, because blocksize is limited by nodes, not miners.

And 98% of nodes right now can support 4MB blocks without a problem.     

hero member
Activity: 697
Merit: 520
September 22, 2016, 05:02:44 PM
#61
If we perpetually increase capacity ahead of demand, fees will never rise. Transactions will always be free or nearly free for users. This will not end well in a future where block subsidy ends and fees alone must support the security of the network (by incentivizing miners). If we did this, we are basically depending on mass adoption and skyrocketing price being guaranteed. That's probably not a good engineering decision.
The demand is there already.
By not increasing capacity, mass adoption becomes impossible. It's already not possible for just 1 million people to make a few transactions per day, let alone many more people.

I don't think you understand what I said. The demand should outweigh capacity if fees are to rise. And fees must rise significantly if the chain is to remain secure many years from now. And regarding capacity... firstly, Segwit, Schnorr and other optimizations + LN will drastically increase capacity. Secondly, it's not clear that fees are discouraging adoption at all---data, please? For those that view BTC as digital gold, 10 or 20 cent fees are not discouraging at all.

Even though I understand, and quite agree to a rise in transaction fees, I don't like the idea that demand should outweigh capacity. Do we want BTC to be a rich guy's club or to become the world's leading cryptocurrency, and to change the world?

Look at Ferrari. They artificially limit their production. They could sell more but they don't to protect the value of their brand, and the exclusivity of their products. I guess it suits them well, but you must look at Ford or Toyota to see who is changing the world.

I don't think it has anything to do with being a "rich guy's club." Being able to run a node shouldn't be a "rich guy's club" either. And though mining already is, do we want to exacerbate that? Maxblocksize is very much about preventing the externalities that limit node/miner participation into more and more centralized groups.

I think "artificial limit" implies a misconception. Bitcoin as a system was a matter of designing a system that optimally aligns market incentives to ensure a decentralized, secure network. We cannot have talk about an "artificial limit" without mention that this limit is specifically designed to ensure decentralization and security---the former by maximizing the ability to run nodes and participate in mining, and the latter by forcing users to pay some of the cost of confirmed transactions, rather than externalizing all costs onto nodes and miners (forcing some off the network). I take exception when people try to make this about "free markets vs. central planning."

Bitcoin was centrally planned from Day 1; the entire point is trying to design the optimal system, with regard for all of its parts. The point wasn't to disregard most of the system so we can guarantee cheap/free transactions for users.
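For reference, the subsidy schedule behind the "fees must eventually carry security" point can be sketched as follows (a simplified model of the halving rule):

```python
# The block subsidy halves every 210,000 blocks, so miner income must
# shift from subsidy to fees over time -- the crux of the argument above.

def subsidy(height, initial=50.0, interval=210_000):
    """Block subsidy in BTC at a given block height (simplified model)."""
    halvings = height // interval
    return 0.0 if halvings >= 64 else initial / (2 ** halvings)

print(subsidy(0))        # 50.0  -- the original reward
print(subsidy(420_000))  # 12.5  -- the era this thread was written in
```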
legendary
Activity: 3052
Merit: 1047
Your country may be your worst enemy
September 22, 2016, 04:25:51 PM
#60
If we perpetually increase capacity ahead of demand, fees will never rise. Transactions will always be free or nearly free for users. This will not end well in a future where block subsidy ends and fees alone must support the security of the network (by incentivizing miners). If we did this, we are basically depending on mass adoption and skyrocketing price being guaranteed. That's probably not a good engineering decision.
The demand is there already.
By not increasing capacity, mass adoption becomes impossible. It's already not possible for just 1 million people to make a few transactions per day, let alone much more people.

I don't think you understand what I said. The demand should outweigh capacity if fees are to rise. And fees must rise significantly if the chain is to remain secure many years from now. And regarding capacity... firstly, Segwit, Schnorr and other optimizations + LN will drastically increase capacity. Secondly, it's not clear that fees are discouraging adoption at all---data, please? For those that view BTC as digital gold, 10 or 20 cent fees are not discouraging at all.

Even though I understand, and quite agree to a rise in transaction fees, I don't like the idea that demand should outweigh capacity. Do we want BTC to be a rich guy's club or to become the world's leading cryptocurrency, and to change the world?

Look at Ferrari. They artificially limit their production. They could sell more but they don't to protect the value of their brand, and the exclusivity of their products. I guess it suits them well, but you must look at Ford or Toyota to see who is changing the world.