
Topic: Get rid of "difficulty" and maintain a constant rate. - page 2. (Read 19923 times)

full member
Activity: 224
Merit: 140
I'm not part of the development team, but my take on it is that you'll just be replacing the randomness with another randomness. Right now, even though the difficulty is very high, blocks are still being generated in under 3 to 5 minutes.

Block generation is at roughly every 10-15 minutes right now.  See:

http://nullvoid.org/bitcoin/statistix.php

for a current report on some statistical averages over the last several blocks that have been generated.  Still, the general point is valid.  Some blocks are being generated in under ten seconds from the previous one, but statistical averages still exist.
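The point about averages can be made concrete: with a roughly constant total hashrate, the time between blocks is approximately exponentially distributed, so very short gaps coexist with a stable mean. A quick illustrative simulation (the numbers are made up, not taken from the report above):

```python
import random

# Under a constant total hashrate, inter-block times are approximately
# exponentially distributed: gaps of a few seconds are expected even when
# the mean is 10 minutes. Purely illustrative, not real network data.
random.seed(1)
mean_minutes = 10.0
gaps = [random.expovariate(1.0 / mean_minutes) for _ in range(100_000)]

avg = sum(gaps) / len(gaps)
under_10s = sum(1 for g in gaps if g < 10 / 60) / len(gaps)

print(f"average gap: {avg:.2f} min")             # close to the 10-minute mean
print(f"share of gaps under 10 s: {under_10s:.1%}")  # small but non-negligible
```

So seeing the occasional sub-ten-second block is entirely consistent with a stable long-run average.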

I do see the variable time between blocks, and in particular the predictability of when the difficulty will next increase, as something which could be used as a manipulation target after a fashion, although I should point out that any such manipulation would by definition require CPU processing power amounting to at least a substantial minority of the overall CPU strength of the part of the network engaged in creating bitcoins.

I add that last qualification because I expect that in time people will start dropping out of the bitcoin creation process, thinking the whole effort is futile, even if maintaining a connection to the network for transaction processing could still be useful. I'm curious where that trend will lead.

The strength of the network is in the overwhelming number of participants where even somebody with a (temporarily) unused server room at their disposal doing nothing but making bitcoin blocks still is a minority of the overall network.  Furthermore, having a couple of "trusted" participants with server farms who are cooperatively making blocks only enhances this protection for everybody and keeps the would-be miscreants at bay.

The only manipulation I can imagine where this proposal would help is an attacker who times the connection and release of significant computing resources: for some period the CPU server farm bangs out bitcoin blocks, then leaves the network when the difficulty increases substantially, waiting for the difficulty to drop back to what it was before it started (doing other work in the meantime, or simply shutting down).

Such efforts over a prolonged period, if successful, could also be detected and even plotted statistically to show an attack was under way. Randomizing the attacks to make them look like "noise" would only reduce the attack's value, and trying to sneak in under the radar as a "normal" user would simply add strength to the network against other would-be attackers, making the attack ineffective in the long run. Attackers would end up fighting each other, while normal users could remain oblivious that any attack was happening at all.
sr. member
Activity: 252
Merit: 268
I had this idea myself, and it's pretty much the same solution in a different form. Yes, the timing of blocks would be more consistent, but in the current implementation the timing is already consistent if you average block generation times over a long period. In the current implementation it's easy to measure sudden increases and decreases in the swarm. In the suggested implementation you could still infer sudden increases and decreases in the swarm from how low the winning hashes are, but it would be much less noticeable.

If confirmation times suddenly increase or decrease dramatically, it warns users that there has been a rush of new users or the abandonment of a botnet, which may cause the exchange rate to fluctuate.

In the current implementation it's a race for the lowest time with a set hash target, while in the suggested implementation it would be a race for the lowest hash with a set time limit. A slow CPU would be just as likely to generate a block. It's the same competition, just with the goal and the limit reversed.
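That "same competition, reversed limits" claim can be checked with a toy simulation. The sketch below (the hash counts are hypothetical) plays out the suggested lowest-hash-in-a-fixed-window race; consistent with the claim, a miner doing three times the hashes wins about three quarters of the windows, the same share it would earn under the current fixed-target rule:

```python
import random

# Two miners with different speeds race within one fixed time window: each
# draws as many uniform "hashes" as its speed allows, and the lowest draw
# wins the window. Speeds are hypothetical, purely for illustration.
random.seed(42)

def run_windows(speed_a: int, speed_b: int, windows: int = 20_000) -> float:
    wins_a = 0
    for _ in range(windows):
        best_a = min(random.random() for _ in range(speed_a))
        best_b = min(random.random() for _ in range(speed_b))
        if best_a < best_b:
            wins_a += 1
    return wins_a / windows

# A miner computing 3x the hashes should win about 3/4 of the windows,
# because the overall minimum is equally likely to come from any draw.
share = run_windows(speed_a=30, speed_b=10)
print(f"fast miner's share of wins: {share:.1%}")
```

In other words, the win probability is proportional to hashes computed under either rule; what changes is the variance of the block interval, not who wins.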

Edited a few times.
full member
Activity: 210
Merit: 100
This is a very very very interesting idea.  It does seem to "automatically" solve the difficulty problem.

To extend it just a bit, a node should broadcast its block as soon as it finds the new lowest hash, even if it's not close to the ten minute mark.  Then, nodes would only broadcast if their new hash was lower than that one, and so on.  This would help minimize the effects of latency and of the nodes' clocks being slightly off.

I'd have to think about this a lot more, but you might be on to something...

The difficulty doesn't adjust every ten minutes; it adjusts every 2016 blocks.

And with your variant, imagine that by sheer luck some machine generated a block with a VERY VERY low hash. If other machines then pick that low hash as the target, most of the blocks that would otherwise have met the target will be dropped, and only after the 2016-block cycle ends will an easier target be set.

The target is not something that only decreases; it may also increase (for example, if some nodes leave the network or stop generating, the nodes still generating should get a better chance, to keep emission at the required level).
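For reference, the retargeting rule being described works roughly like this. A simplified sketch of Bitcoin's per-2016-block adjustment (the constant names are mine, and the real client works on a compact-encoded target, which is omitted here):

```python
# Simplified sketch of per-2016-block retargeting: the target scales by how
# long the last 2016 blocks actually took versus the two weeks they were
# supposed to take. Bitcoin clamps the adjustment to a factor of 4 in either
# direction, approximated below. Constant names are illustrative.
RETARGET_BLOCKS = 2016
TARGET_SPACING_SECONDS = 600  # ten minutes per block

def retarget(old_target: int, actual_seconds: int) -> int:
    expected = RETARGET_BLOCKS * TARGET_SPACING_SECONDS
    # Clamp extreme swings to at most 4x in either direction.
    actual = max(expected // 4, min(expected * 4, actual_seconds))
    # A larger target means easier blocks: if blocks came too slowly
    # (actual > expected) the target grows and difficulty drops, and
    # if they came too quickly the target shrinks.
    return old_target * actual // expected

old = 1 << 224
# Blocks took half the expected time: target halves, difficulty doubles.
print(retarget(old, (RETARGET_BLOCKS * TARGET_SPACING_SECONDS) // 2) == old // 2)
```

So the target moves in both directions, but only once per 2016-block cycle, which is exactly the lag being discussed above.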
sr. member
Activity: 308
Merit: 256
I'm not part of the development team, but my take on it is that you'll just be replacing the randomness with another randomness. Right now, even though the difficulty is very high, blocks are still being generated in under 3 to 5 minutes. So if this new system was in place, you would still be waiting for a block just as long as you would now. I don't usually disclose how many PCs I have in the BTC network for sanity reasons, but let me say that I have systems that can barely manage 90 khash/s, a few that are churning out 19,200 khash/s, and one beast doing 38,400 khash/s. They don't win any more blocks than the much slower PCs do. One of my 900MHz PCs solved a block in under 100 seconds by pure chance alone after the difficulty was increased. The other super clusters are still at 0 after the difficulty went up earlier today.

I'm afraid your solution would give my super clusters a big advantage, because they would always have the lowest-hashed block if it's a CPU vs. CPU thing.
member
Activity: 70
Merit: 11
This is indeed an interesting idea. I'm curious what the devs would think about this idea. It could always be implemented on the test network first.
member
Activity: 103
Merit: 61
This is a very very very interesting idea.  It does seem to "automatically" solve the difficulty problem.

To extend it just a bit, a node should broadcast its block as soon as it finds the new lowest hash, even if it's not close to the ten minute mark.  Then, nodes would only broadcast if their new hash was lower than that one, and so on.  This would help minimize the effects of latency and of the nodes' clocks being slightly off.

I'd have to think about this a lot more, but you might be on to something...
sr. member
Activity: 416
Merit: 277
The primary purpose of generating BitCoins is to provide an incentive for people to participate in the maintenance of the block chain. Generating BitCoins out of "thin air" has recently captured the imagination of a set of new users (me included), and the sudden increase in available computing power has meant a dramatic increase in the rate of block generation.

The increased rate doesn't have any substantial disadvantages or risks that I can see but the variability of the rate is inelegant and it seems to attract a lot of discussion on IRC which distracts from more important issues. I can make a stronger case for the undesirability of an increased rate if required.

The difficulty of block generation will increase to counteract the influx of processing power, and the generation rate will normalize after some delay. I predict that new users will become disillusioned with the apparently unproductive use of their computer time (especially compared with their experience of generating coins easily before the difficulty increase) and leave en masse. The difficulty will not ramp down fast enough to offset this, and we will be left with a period of very slow block generation. This will result in trades taking an irritatingly long time to confirm and arguably leaves the system more susceptible to certain types of fraud.

I predict that successful fraud schemes will be preceded by manipulation of the rate by untraceably and deniably introducing and withdrawing substantial hash computation resources.

It would be much more elegant to be able to rely on blocks being generated regularly at 10 minute intervals (or whatever rate is agreed upon). I believe this can be achieved with only a modest increase in bandwidth.

Simply, as the 10 minutes (or whatever) is about to elapse, hash generating computers broadcast the block they have found with the lowest hash. The other computers briefly stop to check the hash and they only broadcast their block if it has an even lower hash. At the 10 minute mark the lowest hashed block is adopted to continue the chain.

There are some details to iron out, to do with how low the hash has to be versus the time elapsed before you bother breaking the silence and broadcasting it, but I believe this would be a more elegant solution to the rate problem.  People could rely on a fixed number of blocks being generated per day at fixed times, or whatever timetable was mutually agreed.
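A toy model of the proposed round might look like the following. Everything here (node names, hash budgets, the broadcast-when-better rule) is an invented illustration of the idea, not a worked-out protocol:

```python
import hashlib
import random

# Toy model of the fixed-interval proposal: every node hashes for one fixed
# round; a node broadcasts a candidate only if it beats the best hash heard
# so far, and at the deadline the lowest hash extends the chain. Node names
# and per-round hash budgets are invented for illustration.

def h(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big")

def run_round(prev_block: bytes, nodes: dict, seed: int = 0):
    rng = random.Random(seed)
    best_hash = None
    best_node = None
    broadcasts = 0
    for name, hashes_per_round in nodes.items():
        for _ in range(hashes_per_round):
            nonce = rng.getrandbits(64).to_bytes(8, "big")
            candidate = h(prev_block + nonce)
            # Break the silence only when strictly better than anything
            # heard so far, keeping bandwidth modest.
            if best_hash is None or candidate < best_hash:
                best_hash, best_node = candidate, name
                broadcasts += 1
    return best_node, best_hash, broadcasts

winner, lowest, msgs = run_round(b"genesis", {"alice": 500, "bob": 500})
print(f"round winner: {winner}, broadcasts needed: {msgs}")
```

Because a new broadcast happens only on a new running minimum, the expected number of broadcasts per round grows only logarithmically with the number of hashes tried, which is why the bandwidth increase stays modest.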

ByteCoin 
