
Topic: p2pool - Decentralized, Absolutely DoS-Proof, Pool Hopping-Proof Pool [archival] - page 27. (Read 35513 times)

hero member
Activity: 516
Merit: 643
I'm back and I've resumed full-time work on this. :) Watch the SVN if you're interested...
sr. member
Activity: 350
Merit: 250
This is definitely a step in the right direction and I will be watching it closely. We always need to think of new ways to decentralize.
member
Activity: 112
Merit: 10
Ride or Die
Keeping track of the last 600 shares in a distributed environment seems hard.  Let me suggest an alternative that seems much simpler, and that I in fact came up with specifically to design a decentralized pool:
http://forum.bitcoin.org/index.php?topic=25540.0
No one in that thread understood it, but hopefully as a programmer it should be clear to you.  The idea is simple: just auction the block reward to the N highest bidders, where a "bid" is a share and "high" means "low hash".
Your method there is rather at odds with the standard BTC client and mining protocol, no?
sr. member
Activity: 686
Merit: 259
This is hard to turn into a P2P botnet :D
Can't wait to try this :)
full member
Activity: 372
Merit: 114
Keeping track of the last 600 shares in a distributed environment seems hard.  Let me suggest an alternative that seems much simpler, and that I in fact came up with specifically to design a decentralized pool:

http://forum.bitcoin.org/index.php?topic=25540.0

No one in that thread understood it, but hopefully as a programmer it should be clear to you.  The idea is simple: just auction the block reward to the N highest bidders, where a "bid" is a share and "high" means "low hash".
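A rough Python sketch of this auction rule, assuming shares are raw byte strings compared as big-endian hash integers (all names here are illustrative, not from any actual implementation):

```python
import hashlib

def share_hash(share_bytes):
    # Double SHA-256, as Bitcoin uses for block (and share) hashes.
    return hashlib.sha256(hashlib.sha256(share_bytes).digest()).digest()

def pick_winners(shares, n):
    # A "bid" is a share; the "highest bid" means the lowest hash value,
    # so the n winners are simply the n shares with the smallest hashes.
    return sorted(shares, key=lambda s: int.from_bytes(share_hash(s), 'big'))[:n]

def split_reward(winners, reward=50 * 10**8):  # reward in satoshis
    # Split the block reward evenly among the winning bidders.
    per_share = reward // len(winners)
    return {w: per_share for w in winners}
```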
full member
Activity: 158
Merit: 100
Here it says something personal.
member
Activity: 126
Merit: 60
I strongly support this project.

Once I get some coins I will try to donate what the wife doesn't bleed out of me.
member
Activity: 112
Merit: 10
Ride or Die
About the first point: okay, I hadn't considered that. This pool is still not very scalable, though (even if it represents 10% of the network, that's 1 share every 10 seconds).
About the second point: that seems complicated. How about simply putting the 599 previous shares in every found share, and when one is a "winner", paying all the previous 599 shares? Regardless of whether they are from a previous block or not. This is much simpler, allows complete decentralization, and is still pool-hopping resistant.
Brilliant plan!
1) Totally decentralized
2) Each client can pick the ease factor {F}, from 1 = no sharing (they keep the whole block reward) up to an arbitrary maximum of 600 (integers only).
3) Clients will only share work history with other clients set to the SAME ease factor (as the normal BTC client does), building a chain of addresses up to {F} in length (older addresses/submissions are dropped after the normal BTC chain confirms the new ones are validly added, maybe 12 rounds of normal BTC chain additions). This allows picking which network to work in based on your own variables, and hopefully there will be some stats on the different networks' block history to help choose where to start.
4) If blocks are added too quickly, a client can be set to automatically lower the ease factor (go to {F} = {F} - 1), so every time a network reaches the critical size, it automatically slows down the fastest miners. I'd suggest 10 minutes as the target time for submitting blocks across each {F} network as a whole, and no single miner on each {F} network is allowed to submit 2 blocks in 10 minutes, unless the second is a winning block. So if a client would submit 2 in 10 minutes, it instead lowers its {F} factor.
5) Blocks are based on the current or previous main BTC blocks, so if work is submitted promptly, it must be included by the other miners in the p2p block chain, with minimal stale blocks. It will use a similar check for adding blocks as the main client, with the added check of comparing the work to the main client (current or previous round).
6) If a winning block is found before {F} blocks were submitted, the winner keeps the extra shares.


EXAMPLE:
At F = 600, XXX miners connect and mine fast, building the speed up to submitting shares every 10 minutes. Blocks are created, adding to the chain. At chain length 400, the next block is a winner, giving the winner 1/3 of the reward (credit for blocks 1-199 & 600, or 1 & 402-600, depending on how you count it), and the previous 400 submitters get rewarded for each block they added (200-599).
If blocks are then found faster (say, 100 blocks added in the last hour), that triggers the split for all clients set to auto-lower {F}: all clients set to auto-adjust go to (or create, if necessary) a new network.
Also, any clients that find 2 blocks per 10 minutes will do the same. Being among the first miners on any {F} will be just as good as being a later miner, allowing them to add blocks and start the chain, eventually getting all the shares contributed.
On long rounds, the earliest shares contributed will not get rewarded, but that's the drawback for not contributing shares continuously.
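For what it's worth, the "pay the previous F shares" rule, including point 6's winner-keeps-the-rest rule, fits in a few lines of Python (a toy model, not real p2pool code; all names are illustrative):

```python
from collections import deque

class ShareChain:
    """Toy model of 'every winning share pays the previous F shares'."""
    def __init__(self, f=600):
        self.f = f
        self.window = deque(maxlen=f)  # owners of the last f shares

    def add_share(self, owner, is_block_winner, reward=50.0):
        payouts = {}
        if is_block_winner:
            recipients = list(self.window) + [owner]
            per = reward / self.f  # each of the f slots is worth reward/f
            for r in recipients:
                payouts[r] = payouts.get(r, 0.0) + per
            # Point 6: if fewer than f shares exist, the winner keeps the rest.
            payouts[owner] += per * (self.f - len(recipients))
        self.window.append(owner)
        return payouts
```

With 400 prior shares at f = 600, the winner's own slot plus the 199 unfilled slots come to 200/600 = 1/3 of the reward, matching the example above.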
sr. member
Activity: 252
Merit: 250
Working on it - I've pretty much completely formulated the 'plan', and I'm now beginning to update the code.

I've been on vacation since the day after the initial release, so I haven't had much time to code (though I've had lots of time to think!).

Nice news! I recently checked out the code from SVN, and some errors appeared while the miner is running.

I'll wait until you say it is ready for re-testing.
hero member
Activity: 516
Merit: 643
Working on it - I've pretty much completely formulated the 'plan', and I'm now beginning to update the code.

I've been on vacation since the day after the initial release, so I haven't had much time to code (though I've had lots of time to think!).
full member
Activity: 154
Merit: 100
Any updates on the successor so that we can try it out? Thanks!
full member
Activity: 158
Merit: 100
Here it says something personal.

If, let's say, I have 1 GH/s of mining power and the current network hashrate is 11148 GH/s, is the suggestion that every time a block is found (every ~10 minutes) I would get 1/11148 of 50 BTC (or 0.00448511 BTC) every ~10 minutes?



Probably that is the case
full member
Activity: 168
Merit: 100
Having this integrated with the Bitcoin client would be very nice, and I strongly agree that this should have been the mining process all along.

So I'm just trying to understand the implication of this:

If, let's say, I have 1 GH/s of mining power and the current network hashrate is 11148 GH/s, is the suggestion that every time a block is found (every ~10 minutes) I would get 1/11148 of 50 BTC (or 0.00448511 BTC) every ~10 minutes?
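That arithmetic checks out: in expectation, your share of each block reward is your fraction of the network hashrate (figures taken from the question above):

```python
my_rate = 1.0           # GH/s
network_rate = 11148.0  # GH/s, the figure quoted above
block_reward = 50.0     # BTC per block at the time

# Expected payout per ~10-minute block. Actual payouts vary with luck
# unless the pool smooths them out over many shares.
expected_per_block = block_reward * my_rate / network_rate
print(round(expected_per_block, 8))  # 0.00448511
```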

full member
Activity: 154
Merit: 100
Having this integrated with the Bitcoin client would be very nice, and I strongly agree that this should have been the mining process all along.
full member
Activity: 158
Merit: 100
Here it says something personal.
Maybe if you create a token which finds other ones, you could build a solution, but I don't know how it can be done. If you could get a selectable group of decentralized miners... that could be a way.
hero member
Activity: 658
Merit: 500

p2pool is a great idea! I bet it will someday be implemented natively in the Bitcoin client, because it is the p2p way to mine, as the whole Bitcoin project is. Go on!

There should be a more general solution where you join a network and the bitcoins get distributed by the client itself somehow, without any kind of operator.
In fact, this should somehow be done within the Bitcoin network itself.
sr. member
Activity: 252
Merit: 250
What miner are you using? It seems to be submitting shares with a fixed difficulty of 1 rather than paying attention to the 'target' in the getwork message, in which case frequent messages like this would be normal.

cpuminer

I'm using a toy miner on p2pool, only for testing purposes.

I'm working on a successor to p2pool that is completely decentralized. I'd recommend not trying to use p2pool as it is now; technically it's fine (the network is up and I'm mining on it), but it probably won't generate a block before I finish the successor. Also, accordingly, the SVN repo is in flux; if you still want to use p2pool, use the released version.

p2pool is a great idea! I bet it will someday be implemented natively in the Bitcoin client, because it is the p2p way to mine, as the whole Bitcoin project is. Go on!

So I'll follow your advice and I'll get fresh versions from SVN for my tests.
hero member
Activity: 516
Merit: 643
What miner are you using? It seems to be submitting shares with a fixed difficulty of 1 rather than paying attention to the 'target' in the getwork message, in which case frequent messages like this would be normal.

I'm working on a successor to p2pool that is completely decentralized. I'd recommend not trying to use p2pool as it is now; technically it's fine (the network is up and I'm mining on it), but it probably won't generate a block before I finish the successor. Also, accordingly, the SVN repo is in flux; if you still want to use p2pool, use the released version.
sr. member
Activity: 252
Merit: 250
There just aren't many users currently (only me at the moment). Can you post any Python error messages you see?


Code:
GOT SHARE! afe24d1fd5f252cceb1b781f683db4bdddc45da174456f5f0d2d16ef

Error processing data received from worker:
Traceback (most recent call last):
  File "main.py", line 399, in got_response
    p2p_share(share)
  File "main.py", line 245, in p2p_share
    res = chain.accept(share, net)
  File "main.py", line 55, in accept
    share2 = share.check(self, height, previous_share2, net) # raises exceptions
  File "/home/shevek/bitcoin/p2pool_2/p2pool.py", line 103, in check
    raise ValueError('not enough work!')
ValueError: not enough work!
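For context, the 'not enough work!' error in the traceback is the share-difficulty check rejecting a hash above the share target, which lines up with the fixed-difficulty-1 miner discussed above. A minimal sketch of that kind of check (not p2pool's actual code; only the difficulty-1 target constant is Bitcoin's standard maximum target):

```python
# Bitcoin's difficulty-1 (maximum) target; a valid hash must not exceed
# the target, and higher difficulty means a smaller (harder) target.
DIFF1_TARGET = 0xffff * 2**208

def target_for_difficulty(difficulty):
    return DIFF1_TARGET // difficulty

def check_share(hash_int, share_target):
    # A share counts only if its hash, taken as an integer, is at or
    # below the share target; otherwise it is rejected.
    if hash_int > share_target:
        raise ValueError('not enough work!')
    return True
```

A miner submitting difficulty-1 shares against a higher share target would trip this check on most of its submissions, producing frequent messages like the one above.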