
Topic: Permanently keeping the 1MB (anti-spam) restriction is a great idea ... - page 13. (Read 105069 times)

hero member
Activity: 772
Merit: 501
I've read at least hundreds of posts on this topic, but OP's post is far and away the most convincing of any I've read.

+1
jr. member
Activity: 46
Merit: 1
I've read at least hundreds of posts on this topic, but OP's post is far and away the most convincing of any I've read.
sr. member
Activity: 435
Merit: 250
The block size has to be increased, that's obvious.
sr. member
Activity: 346
Merit: 250
Thank you OP. This perspective is sorely needed.

Those wishing to keep a small block size are in favor of a useless Bitcoin network. Some may simply be misled by a half-baked economic theory which says miners will go broke if they process lots of transactions and the protocol gives them the ability to process many more per block... but I suspect there are people who truly wish to make raising the limit a real problem for the dev community.

People advocating for a tiny max block size are no friends of Bitcoin.

+1

+2

Totally agree and my vote is with D&T's support of increased block size.

In agreement here, too.  And the best part is, it doesn't matter what the anti-fork crowd think, because whatever they say, we can go ahead and do it without them.  They can stay on their old, limited chain if they want, but they can't force the rest of us to stay.    



legendary
Activity: 1400
Merit: 1013
I was already assuming a perfectly idealized p2p network that had no overhead or sub-linear scaling. I've done as much to explore the space of efficiency gains in this kind of system as any two other people combined here, come on. Please don't try to play off that I don't know how the system works.
What I mean is that your perfectly idealized p2p network is still wrong.

A more detailed explanation is forthcoming.
This is the explanation to which I was referring:
http://bitcoinism.liberty.me/2015/02/09/economic-fallacies-and-the-block-size-limit-part-2-price-discovery/
legendary
Activity: 3934
Merit: 3190
Leave no FUD unchallenged
Thank you OP. This perspective is sorely needed.

Those wishing to keep a small block size are in favor of a useless Bitcoin network. Some may simply be misled by a half-baked economic theory which says miners will go broke if they process lots of transactions and the protocol gives them the ability to process many more per block... but I suspect there are people who truly wish to make raising the limit a real problem for the dev community.

People advocating for a tiny max block size are no friends of Bitcoin.

+1

+2

Totally agree and my vote is with D&T's support of increased block size.

In agreement here, too.  And the best part is, it doesn't matter what the anti-fork crowd think, because whatever they say, we can go ahead and do it without them.  They can stay on their old, limited chain if they want, but they can't force the rest of us to stay.   
hero member
Activity: 772
Merit: 501
Thank you OP. This perspective is sorely needed.

Those wishing to keep a small block size are in favor of a useless Bitcoin network. Some may simply be misled by a half-baked economic theory which says miners will go broke if they process lots of transactions and the protocol gives them the ability to process many more per block... but I suspect there are people who truly wish to make raising the limit a real problem for the dev community.

People advocating for a tiny max block size are no friends of Bitcoin.

+1
donator
Activity: 1464
Merit: 1047
I outlived my lifetime membership:)
Thank you OP. This perspective is sorely needed.

Those wishing to keep a small block size are in favor of a useless Bitcoin network. Some may simply be misled by a half-baked economic theory which says miners will go broke if they process lots of transactions and the protocol gives them the ability to process many more per block... but I suspect there are people who truly wish to make raising the limit a real problem for the dev community.

People advocating for a tiny max block size are no friends of Bitcoin.
legendary
Activity: 868
Merit: 1006
Lol at thinking we can live forever with the amazingly shitty 1MB limit. Get a grip boys.
legendary
Activity: 1904
Merit: 1007
I read and...?

Did I have to agree with the solution?

The main issue for me is that we need to promote and secure decentralization, and right now Bitcoin is more centralized than the banking system. A single guy (not a company or a group) controls 1%+ of the system, and most of the system sits in a very small club of people. That is not decentralized at all, and it makes us a weak system that is simple to attack.

Juan

So you are against raising the block limit because of the space problem, and you don't agree with the blockchain pruning solution? Ok. That makes a lot of sense.
hero member
Activity: 658
Merit: 500
The main issue for me is that we need to promote and secure decentralization, and right now Bitcoin is more centralized than the banking system. A single guy (not a company or a group) controls 1%+ of the system, and most of the system sits in a very small club of people. That is not decentralized at all, and it makes us a weak system that is simple to attack.

Juan

So, Bitcoin is more centralized than the banking system? Does that mean I can no longer make a BTC1000 transaction without explaining why and how I would do such a thing?
hero member
Activity: 532
Merit: 500
TaaS is a closed-end fund designated to blockchain
If the block size increases exponentially, the number of nodes may drop significantly just because they require too much disk. The network keeps growing, but there are not that many full nodes (nodes sharing the blocks).

Have you read all the posts in this thread? The "much disk" will not be needed! We have blockchain pruning.

Well, if the block size rises, the size of the blockchain database may rise too, or the upgrade is useless. Bigger blocks mean more disk space, and in my opinion there are not that many full nodes.

Regards

Juan


Ok, you are unable to read what people post here. Let me spell it out for you:

B-L-O-C-K-C-H-A-I-N  P-R-U-N-I-N-G + cheapening of $/TB

Maybe now you get it!


I read and...?

Did I have to agree with the solution?

The main issue for me is that we need to promote and secure decentralization, and right now Bitcoin is more centralized than the banking system. A single guy (not a company or a group) controls 1%+ of the system, and most of the system sits in a very small club of people. That is not decentralized at all, and it makes us a weak system that is simple to attack.

Juan

legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
Satoshi didn't have a 1MB limit in it. The limit was originally Hal Finney's idea.  Both Satoshi and I objected that it wouldn't scale at 1MB.  Hal was concerned about a potential DoS attack though, and after discussion, Satoshi agreed.  The 1MB limit was there by the time Bitcoin launched.

It would be great if Satoshi would chime in.  Maybe if the coin does indeed begin to snap in two?

There is no need for Satoshi to chime in (although if he did reappear I have a list of questions).   The first version of the client had no block size limit.   The second version of the client had no block size limit.  The next 146 commits to the repo had no block size limit.  The source code is the proof.   The block size limit wasn't added as an anti-spam mechanism until more than 21 months after the genesis block.

I think Satoshi was worried about rapid improvements in hashing power giving one rogue miner the ability to bloat the blockchain with a series of large (32MB?) blocks. The first mention of using FPGAs that I can find on Bitcointalk is July 2010, three months before the 1MB change. At the time Bitcoin was gaining traction with a community behind it, and a serious spam attack would have damaged its progress.

Thanks for the hashing analysis from a much more experienced perspective! I am still interested in how this little processor can do... even if I was off by a factor of about 10, it might still be competitive with much more expensive and energy intensive desktop processors. I can get about 2100 khash/sec using all 4 cores of my 64 bit machine when the system is otherwise idle, and that certainly makes the fans blow a lot of hot air. I thought it might be possible for VIA to overcome because custom circuits (FPGA or ASIC) for some cryptographic functions have in the past proved orders of magnitude faster than general desktop processors or even GPUs.

(my bold emphasis)
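The worst-case bloat scenario described above (a rogue miner filling every block up to the old 32 MB p2p message cap, the only effective limit before the 1 MB rule) works out roughly as follows; the figures are illustrative arithmetic, not from the thread:

```python
# Worst-case chain growth if a rogue miner filled every block to the
# 32 MB p2p message cap (circa 2010, before the 1 MB anti-spam rule).
MSG_CAP_MB = 32
BLOCKS_PER_DAY = 144  # one block per ~10 minutes on average

daily_growth_gb = MSG_CAP_MB * BLOCKS_PER_DAY / 1024
yearly_growth_tb = daily_growth_gb * 365 / 1024
print(round(daily_growth_gb, 1))   # 4.5 GB per day
print(round(yearly_growth_tb, 2))  # ~1.6 TB per year
```

On 2010 hardware and bandwidth, growth of that order would plausibly have driven most full nodes off the network, which is consistent with the DoS concern attributed to Hal Finney above.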
legendary
Activity: 1904
Merit: 1007
If the block size increases exponentially, the number of nodes may drop significantly just because they require too much disk. The network keeps growing, but there are not that many full nodes (nodes sharing the blocks).

Have you read all the posts in this thread? The "much disk" will not be needed! We have blockchain pruning.

Well, if the block size rises, the size of the blockchain database may rise too, or the upgrade is useless. Bigger blocks mean more disk space, and in my opinion there are not that many full nodes.

Regards

Juan


Ok, you are unable to read what people post here. Let me spell it out for you:

B-L-O-C-K-C-H-A-I-N  P-R-U-N-I-N-G + cheapening of $/TB

Maybe now you get it!
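A rough sketch of why pruning addresses the disk-space worry: a pruned node keeps the set of spendable outputs (the UTXO set) plus a window of recent blocks, and discards old raw block data. Every number below is an illustrative assumption (chain size, UTXO-set size, retained-block count), not a measurement:

```python
# Disk footprint of a pruned node vs. the full chain.
# All sizes are illustrative assumptions for this sketch.
FULL_CHAIN_GB = 30.0   # assumed full-chain size at the time
UTXO_SET_GB = 1.0      # assumed size of the spendable-output set
RETAINED_BLOCKS = 288  # assumed window of recent blocks kept
BLOCK_MB = 1.0         # full blocks at the current 1 MB limit

pruned_gb = UTXO_SET_GB + RETAINED_BLOCKS * BLOCK_MB / 1024
print(round(pruned_gb, 2))  # ~1.28 GB instead of 30 GB
```

The point of the post above is that the pruned footprint scales with the UTXO set and the retained window, not with the full history, while $/TB keeps falling.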
hero member
Activity: 532
Merit: 500
TaaS is a closed-end fund designated to blockchain
If the block size increases exponentially, the number of nodes may drop significantly just because they require too much disk. The network keeps growing, but there are not that many full nodes (nodes sharing the blocks).

Have you read all the posts in this thread? The "much disk" will not be needed! We have blockchain pruning.

Well, if the block size rises, the size of the blockchain database may rise too, or the upgrade is useless. Bigger blocks mean more disk space, and in my opinion there are not that many full nodes.

Regards

Juan
hero member
Activity: 772
Merit: 501
They can be solved simultaneously. In the meantime, Bitcoin needs to seize the opportunity it has to attain mass adoption, which I believe cannot happen with a limit of ~1,800 transactions per block, given the basic constraint that places on access per user.
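The ~1,800-transactions-per-block figure follows from simple arithmetic, assuming an average transaction of roughly 550 bytes and one block per ten minutes (both assumptions, not stated in the thread):

```python
# Back-of-the-envelope throughput under a 1 MB block size limit.
# AVG_TX_BYTES is an assumed rough average, not a measured value.
MAX_BLOCK_BYTES = 1_000_000
AVG_TX_BYTES = 550
BLOCK_INTERVAL_S = 600  # one block per ~10 minutes on average

txs_per_block = MAX_BLOCK_BYTES // AVG_TX_BYTES
tps = txs_per_block / BLOCK_INTERVAL_S
print(txs_per_block)  # 1818 -- roughly the ~1,800 figure cited
print(round(tps, 2))  # 3.03 transactions per second
```

A few transactions per second, globally, is the "basic constraint on access per user" the post refers to.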
hero member
Activity: 532
Merit: 500
TaaS is a closed-end fund designated to blockchain
We see a huge concentration in the mining environment: 4-5 hands control 75%+ of the mining power (pools and large operations).

If the block size increases exponentially, the number of nodes may drop significantly just because they require too much disk. The network keeps growing, but there are not that many full nodes (nodes sharing the blocks).

The dominance of large mining pools is not due to the cost of running a node. There are thousands of full nodes, and yet the top 4-5 pools direct the majority of the network hashrate, as you note. The reason large pools have high hashrates is that pool size reduces the payout variance to miners, so miners prefer to use large pools.

So you need to be clear what you're trying to solve, and what is the cause of the problem. Because keeping the block size at 1 MB forever is not going to reduce the dominance of mining pools.

I am just saying there are issues to be solved that are more relevant than the size of the block.

Juan
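The payout-variance argument above can be made concrete with a toy model: under proportional payouts, a miner's expected reward per network block does not depend on pool size, but the variance of that reward falls as the pool's hashrate share grows. All numbers here are illustrative assumptions:

```python
# Toy model of mining payout variance under proportional pool payouts.
# R, p and the pool shares are illustrative assumptions.
R = 25.0     # block reward in BTC
p = 0.0001   # miner's fraction of the total network hashrate

def payout_stats(q):
    """Mean and variance of the miner's payout per network block when
    mining in a pool holding fraction q of the hashrate (fees and the
    pool's cut ignored). The pool wins a block with probability q and
    pays the miner a p/q share of the reward."""
    mean = p * R                               # independent of q
    var = (p / q) ** 2 * R ** 2 * q * (1 - q)  # falls as q grows
    return mean, var

for q in (p, 0.01, 0.25):  # solo, a 1% pool, a 25% pool
    mean, var = payout_stats(q)
    print(f"pool share {q:.4f}: mean {mean:.4f} BTC, variance {var:.2e}")
```

Same expected income, much smoother cash flow, which is why hashrate gravitates toward large pools regardless of the block size limit.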
donator
Activity: 1218
Merit: 1079
Gerald Davis
Satoshi didn't have a 1MB limit in it. The limit was originally Hal Finney's idea.  Both Satoshi and I objected that it wouldn't scale at 1MB.  Hal was concerned about a potential DoS attack though, and after discussion, Satoshi agreed.  The 1MB limit was there by the time Bitcoin launched.

It would be great if Satoshi would chime in.  Maybe if the coin does indeed begin to snap in two?

There is no need for Satoshi to chime in (although if he did reappear I have a list of questions).   The first version of the client had no block size limit.   The second version of the client had no block size limit.  The next 146 commits to the repo had no block size limit.  The source code is the proof.   The block size limit wasn't added as an anti-spam mechanism until more than 21 months after the genesis block.
member
Activity: 63
Merit: 10
Satoshi didn't have a 1MB limit in it. The limit was originally Hal Finney's idea.  Both Satoshi and I objected that it wouldn't scale at 1MB.  Hal was concerned about a potential DoS attack though, and after discussion, Satoshi agreed.  The 1MB limit was there by the time Bitcoin launched.

It would be great if Satoshi would chime in.  Maybe if the coin does indeed begin to snap in two?
full member
Activity: 224
Merit: 100
The issue is not whether a larger block is technically advantageous; it clearly is.

The issue is that many people will not update or go with the new fork, thus creating mass chaos.