Topic: [1500 TH] p2pool: Decentralized, DoS-resistant, Hop-Proof pool - page 225. (Read 2591920 times)

legendary
Activity: 1232
Merit: 1000
So, I've been running my BTC node using pypy for a while now & I'm very happy with it. Anyone who has a lot of spare RAM - I'd highly recommend using it: much faster, fewer DOA/rejects & very stable  Grin

Still would like to see it in C, though...

I have lotsa spare RAM ... need a guide; I downloaded everything to get it running but failed. Maybe I missed something?

Using Windows here.

My node has been up for over 3 1/2 days: 278 shares, including 22 orphans & 2 dead. I'll be happy if I can get those numbers lower.

Click on my sig to go to my node. Any input to improve it is welcome.

TIA

Where is it located?
legendary
Activity: 1596
Merit: 1000
Speaking of pypy, I was wondering what the verdict was. I have plenty of RAM available. Is the switch fairly painless? I'd like to give it a go, but I also don't want to mess anything up with my Linux noobishness.
full member
Activity: 932
Merit: 100
arcs-chain.com
So, I've been running my BTC node using pypy for a while now & I'm very happy with it. Anyone who has a lot of spare RAM - I'd highly recommend using it: much faster, fewer DOA/rejects & very stable  Grin

Still would like to see it in C, though...

I have lotsa spare RAM ... need a guide; I downloaded everything to get it running but failed. Maybe I missed something?

Using Windows here.

My node has been up for over 3 1/2 days: 278 shares, including 22 orphans & 2 dead. I'll be happy if I can get those numbers lower.

Click on my sig to go to my node. Any input to improve it is welcome.

TIA

The Gentoo ebuild has a low-memory USE flag; it might help with memory use, but it's disabled by default (-low-memory):

[ebuild  N     ] dev-python/pypy-2.4.0:0/2.4  USE="bzip2 gdbm jit ncurses sqlite tk -doc -low-memory (-sandbox) -shadowstack" CPU_FLAGS_X86="(-sse2)"

I'll have to test this ASAP.

Is there modified p2pool code for multiprocessing somewhere?

Do you just import multiprocessing at the beginning of the code, or does it use multiple processes without modification? PyPy does support it, I guess.
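
For what it's worth, importing multiprocessing at the top of a program does nothing by itself - the code has to be restructured to spawn worker processes explicitly, and p2pool (built on Twisted) runs as a single process out of the box. A minimal sketch of the standard pattern, with a made-up check_work task standing in for real work:

Code:
# Generic Python multiprocessing pattern -- a sketch, not p2pool code.
# check_work is a hypothetical CPU-bound task used only for illustration.
from multiprocessing import Pool

def check_work(item):
    return item * item

if __name__ == '__main__':
    pool = Pool(processes=4)                    # spawn 4 worker processes
    print(pool.map(check_work, range(10)))      # farm the work out in parallel
    pool.close()
    pool.join()

PyPy ships the same multiprocessing module as CPython, so the pattern works there too; what no interpreter can do is parallelize unmodified single-process code.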
legendary
Activity: 1500
Merit: 1002
Mine Mine Mine
So, I've been running my BTC node using pypy for a while now & I'm very happy with it. Anyone who has a lot of spare RAM - I'd highly recommend using it: much faster, fewer DOA/rejects & very stable  Grin

Still would like to see it in C, though...

I have lotsa spare RAM ... need a guide; I downloaded everything to get it running but failed. Maybe I missed something?

Using Windows here.

My node has been up for over 3 1/2 days: 278 shares, including 22 orphans & 2 dead. I'll be happy if I can get those numbers lower.

Click on my sig to go to my node. Any input to improve it is welcome.

TIA
hero member
Activity: 686
Merit: 500
WANTED: Active dev to fix & re-write p2pool in C
So, I've been running my BTC node using pypy for a while now & I'm very happy with it. Anyone who has a lot of spare RAM - I'd highly recommend using it: much faster, fewer DOA/rejects & very stable  Grin

Still would like to see it in C, though...
legendary
Activity: 1232
Merit: 1000
I have another p2pool question. I am mining on 2 different nodes and I notice on both that the list of miners is always changing which position my miners are in. Is this in some way tied to completed shares, even if the shares don't meet the difficulty threshold to count as a p2pool share? I guess I just wonder if there is a reason for the list to be constantly shuffling around.
It's like solo mining, but at a lot lower difficulty.
The pseudo shares are only there to show your hash rate.
They really mean nothing at all.
The real shares (no idea how many million diff it is at the moment) are on all nodes, recorded under the address the share was mined for.

Aside:
So if you mine on multiple p2pool nodes with the same address, you'll see your expected payout as the sum of all shares under that one sharechain address.
If you mine on multiple p2pool nodes with different addresses, then the payout will be split across the individual addresses - but the same total value.
(Of course the 2nd option means that you'll end up with the payout broken up into smaller amounts, which isn't very wise for anyone but a large miner.)

Thanks Kano
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
I have another p2pool question. I am mining on 2 different nodes and I notice on both that the list of miners is always changing which position my miners are in. Is this in some way tied to completed shares, even if the shares don't meet the difficulty threshold to count as a p2pool share? I guess I just wonder if there is a reason for the list to be constantly shuffling around.
It's like solo mining, but at a lot lower difficulty.
The pseudo shares are only there to show your hash rate.
They really mean nothing at all.
The real shares (no idea how many million diff it is at the moment) are on all nodes, recorded under the address the share was mined for.

Aside:
So if you mine on multiple p2pool nodes with the same address, you'll see your expected payout as the sum of all shares under that one sharechain address.
If you mine on multiple p2pool nodes with different addresses, then the payout will be split across the individual addresses - but the same total value.
(Of course the 2nd option means that you'll end up with the payout broken up into smaller amounts, which isn't very wise for anyone but a large miner.)
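
To make the aside concrete, here is a toy illustration (not p2pool code) of why both options pay the same total: the payout is proportional to the share weight recorded under each sharechain address, and weights under the same address simply add up, whichever node mined them.

Code:
# Toy PPLNS-style payout split -- illustration only, all values made up.
block_reward = 25.0                      # BTC, assumed for the example
shares = {'addr_A': 60, 'addr_B': 40}    # address -> share weight in the window
total = sum(shares.values())
payouts = {a: block_reward * w / total for a, w in shares.items()}
print(payouts)                           # {'addr_A': 15.0, 'addr_B': 10.0}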
legendary
Activity: 1232
Merit: 1000
I have another p2pool question. I am mining on 2 different nodes and I notice on both that the list of miners is always changing which position my miners are in. Is this in some way tied to completed shares, even if the shares don't meet the difficulty threshold to count as a p2pool share? I guess I just wonder if there is a reason for the list to be constantly shuffling around.
legendary
Activity: 1596
Merit: 1000
Has anyone updated their S4 to the latest firmware? It seems to have introduced a strange bug with p2pool. I'd like to figure out what's up.

Details start here: https://bitcointalksearch.org/topic/m.10340651

On the upside, stales/DOA are MUCH lower than on the previous firmware.
legendary
Activity: 1344
Merit: 1024
Mine at Jonny's Pool
Gladly.  You can also do it the old-fashioned way.  To see the list of addresses that would receive a portion of your donation, you can query your p2pool node directly.  For example, if you wish to donate 1 BTC to p2pool miners, you can see how that donation would be broken up like this:

Code:
http://104.131.12.128:9332/patron_sendmany/1

(That's my backup node's IP, by the way.)  The result is this:

Code:
{"12LhpP6NAM5VxmgDEeibkXHbCZJMMWchL6": 0.01118331, "1GKRR8GhV8wQEkLBKc6gVT8mMK1W9hso1X": 0.06319452,
"1HkAkTHGAxLKB8EqCh5H3sgbEpsjmR3sr6": 0.01349777, "1NastyBTCcxbAQQ22kEijbXqgyBPRujftL": 0.01181574,
"182xf7oE44H4G3NGSME76B3DGRLf7Ej1B6": 0.01602925, "1H2xBneyBRAYnHBCQ4MHfYYo9tNJ9B49Rf": 0.01023101,
"12q4Ysn7RaxMUsa8gzyvPxyCV9bJpiftuQ": 0.01282476, "13hcExuq3rxcvZkSuZDnLkHYU6zLxQ7MsX": 0.02063493,
"15x4opUdJJpVrkHJ4pGzWpVMvrodU3bvXu": 0.41801016, "1BnbYeLnz5Cpry6H7NRwgV5pofkEXokac8": 0.16763402,
"1GRg24ishi4HMC5hoPzKaTmN25yu9bAAax": 0.02163372, "1GUWNoSCnEf1wEiMx4RCGfvmYUYNCRh589": 0.08850434,
"1AckkfhXexaVqgg5Xc6dmidqmnKQRBbBw2": 0.01786333, "1Q3Tp2wCbVaELbtcaiZX5m4bAdafzrieT3": 0.0155312,
"1DbM7ukCgNCqN7HHkosGjrYkc5aL7mzutZ": 0.01992841, "1P2PooLUt1qsnqJgLeGsWN9P4qgie8HwG3": 0.0393303,
"1AzcWtoWWoUXigy7JZig5G6vCLKAwTDC8w": 0.01411288, "1MjeEv3WDgycrEaaNeSESrWvRfkU6s81TX": 0.03804035}

You can change the value of each portion of the donation by adding a second parameter to the call:

Code:
http://104.131.12.128:9332/patron_sendmany/1/.0001

That second parameter says, "Make the donation threshold 0.0001 BTC".  Running it with my same 1 BTC donation results in this:

Code:
{"1AckkfhXexaVqgg5Xc6dmidqmnKQRBbBw2": 0.01770819, "1MQb5UgG9gywtk1mbsamb2SUz7i8gKRMwi": 0.00140205,
"1PWKEza9kKJt7ysH9tBEyJnby4A7MbHSQy": 0.00060238, "15YN5CyHxu1JGDhDXuFmroqUdziPNZcTsQ": 0.00025779,
"1wTJSnYCTLQFAjUPi82Q8n53ouBZunVD3": 0.00173407, "13fTWKW3CMhhM5jv5Nc7vmaF9QdgJe9SoD": 0.00240476,
"1JFzaEzTk4ky1Ja8QFiKgQd7Jwsh4FLPJg": 0.00128848, "1KAWYV2yCf8jKnhbfAiSUgDpF9ZgrWDrTh": 0.0069349,
"1KcRjzsXpnSE8oZWtMsQccLrsQsdmr6sDC": 0.00010828, "1Jzx1WJFA4apYd3HXKNEvJy7L4BW4rCg7g": 0.00101653,
"14si1DPQiZW6UJj1RuqN4wubkTncirhrro": 0.00573829, "1Pexq51URrb88J7GhaATx8sZafvbiLAEnu": 0.00216242,
"12LhpP6NAM5VxmgDEeibkXHbCZJMMWchL6": 0.01118529, "1MjeEv3WDgycrEaaNeSESrWvRfkU6s81TX": 0.0380471,
"1GGTWZuiSgJneBVbfPNhYYqFh1792T98cT": 0.00126568, "18y7ru6FkFeamVMt5wwfge1FopKPtTzPzo": 0.00060877,
"1K9pZ3uorsAgbvEegXmYdbRBM8Vwkb1ev9": 0.00017402, "1BQsyPcbRRfjensocuaTfnBQNHL5vqVXgC": 0.00073387,
"1MuyEVptu27saxLsLhSmvwZn6pDBY2xGWP": 0.00073158, "18w5o227L68oNhAJCG1a5kpJ4ptfZN7Rx2": 0.00076171,
"1PiF3CWBW9hQaD7TrqQbN39iCcZNeCDp66": 0.00143308, "1KGYAwTYRFZkvwaJKNHraURMv3z5DyhPjW": 0.00015987,
"1KoWTk6WYewdEeeBDc5QHxEnYzDwi6khwJ": 0.00013716, "13BsxHKDbVg1haeEj1cPG3p9xm5p4EQEsA": 0.00280936,
"1A511TNe1ZEFiGSYGnZYSKdZ1WBwmq2i1o": 0.00032469, "1JUgwRQ7AwmjcavZcvaVBTQG8zXGM4geJF": 0.00075288,
"1BdTcMu4hoPKfpMpj4WD98RneCnZTnfurk": 0.00017802, "1o5fH4W7XwNhx2vdrNpwrYMXE9YRVHUEN": 0.00575196,
"1HkRFHBGuqbQhmxVNFaHTUN6ZRW2fVSssw": 0.0002971, "1ME6Yeqp2kZKxRS1wTRNnDLMsdWeEvGNPP": 0.00045273,
"1LgmrqQDB6ivTbXgFDzyjvBeyZZTKnxgG9": 0.00054954, "17s64G3ZcYyLLyvo3sLZnNbBfV3p4AvmJR": 0.00388727,
"1M3tf4bf6xNsU2enpC3REDk2hh8p2AoezG": 0.00321995, "1HE8HsNdSg7SftTYfZMuv9Z5HjX7yNa5DX": 0.00086592,
"1B1m6KyANNeN7BL81PrhcZ5Qv28JAYJx2T": 0.00029611, "1A6g3iTykXabmuQkVAgueyWBV6J1pSNCEM": 0.00104252,
"1Kz5QaUPDtKrj5SqW5tFkn7WZh8LmQaQi4": 0.00070586, "19XvUuoqo3z7NsWACvA9TKptASZZAyYQ4g": 0.0006881,
"1Kt4GRQCTfXELuvyXuwGTSYcCXm3rQ5xkd": 0.0042189, "12DEeUzReTxEpuRo7FLN6XJxHVDveZfUPJ": 0.00016266,
"15k6pRG84XCbvEMVk1v8TQC7HoykFZFtxX": 0.00117358, "15sK72HFYEuxhks876eSQdwVhf5xssxTMY": 0.00085206,
"1C2JwstTfUyj32yKv5N2LDu7DXxrJjMafg": 0.00082838, "13JFmjThCK7Fgp2RGrYtBNNwW9bW1CxKQ7": 0.00045662,
"1PByWhUBiN7K1ya1zDE2QPeVws13TyHn3L": 0.00063604, "149tNVGRZUWjLnLP1aKL9gp2kf7i7VmfYW": 0.00034851,
"1NZGrfZ98HW91mxGFxBKKKdDFLJhpT1Gjg": 0.00097647, "1CP7c8J5QeUhjgSXdfmL91UWJtuLqHw2FS": 0.00032459,
"1K4pKZ7qfFF8wMGq1Q7v6yYuny5BbgCPDW": 0.0005281, "18fLQJ3XhBcEFBmNwj6YYoQV6D8vgM6Xfd": 0.00411854,
"1GYt7iVjXkskyfyT3BCUhHBbwHtWQPzmeg": 0.00015902, "1RuMadLoiSLWy5PyipzvKK1mVjHjB6gAW": 0.00604022,
"1Cbv2XefAmqVPjneeQydP6USsMyy3j4wdo": 0.00090011, "1FRiKqxCSZV3AXvcV4b1ZNNrDhNzzPomJ4": 0.00100894,
"1FPkPmQrTnWykaGvpEg3Q9g4FXEthnqqYH": 0.00054789, "14XXBCJUr9E7svMSMQ1ppZ2pFC39x4YFSS": 0.00018077,
"19G2n3t1kD3LMQHDp9jFq3e829Mc4bWDQ9": 0.00123558, "139RbcC322Wcy9ibQSMNcXbrihq9Cdrhnk": 0.00016389,
"182xf7oE44H4G3NGSME76B3DGRLf7Ej1B6": 0.0160321, "16taCMamAnYkhgiRLztHppqjoFBs881rSM": 0.00258894,
"1Fmf5jy1WN8CcCBEPpwZUgvByfPhRhqASo": 0.00337247, "1AqNu9B5kcJEUEnQejargPn3rL4JXGbre4": 0.00043041,
"1BEszr9btZ6H7iYMLV3HfGnnuZon79Gxrx": 0.00034394, "1Jo5yeaoQt7dcpuuny1gxcyxLaYyqKjUbY": 0.0008949,
"16GSosqeCKfuPjRjZKPSqLR4TMBR9T7nYE": 0.00035444, "1Q7NFKSbjzXHAQ9uxyAVYnKcVZKuQ9E4z1": 0.00033883,
"1BwAJ1Lh8yNHA98aYDJxkirmonQvmZqAhn": 0.00158366, "1LFf8LNk8qBs7baedMQRWnS8SnvuwUP8Rf": 0.00071906,
"1D28uA7g9XghY8R1KyM3nEB7ttmJG8JGuA": 0.00119885, "1A6uKUk5QdRzrRcPjmBu1zrKSTdftxaTSa": 0.00053389,
"1PoZHY9PSrBNo2P1fmoyaatFzdy1UiZaFC": 0.00412268, "1MpG1ZGQu4gmoQ1aRR5ASRovfoMNwzeeyY": 0.00112255,
"1MovshMf6v313RtWtSeoKfqMc1PwT1nY1V": 0.00035341, "1DreA5o9JB3QtSrbHuUCrHy3aRRGZYkusL": 0.00010916,
"13197Yrpm15iwVMhi4HMHwGp8wTwaPXVtp": 0.00192223, "1N9r182ZzpN8xHZevphNydVQ5jwyqeRJnB": 0.0013804,
"1CL4xWU4Ht36WTJb1nidnnQCKMsYqvsby9": 0.00054171, "12fLDusUFtUF6s1eUJ3UBdT8gi31WmD3BQ": 0.00027274,
"1PRdEV1SYrzgqQagfhVdHBEWHRFmYaDF6j": 0.00042458, "1Lnrm4euRRd865ZvM6mmHTvG8mCspgDKmL": 0.00205253,
"1DWYqM5YN6NC13JEcBq29tfozNkEFrHNTv": 0.00373541, "1J9cyuFmkhkYqG2WMqDBXg9LPTBv7xDvgj": 0.00074732,
"1B9Cdt95NW1CgjpogtCzHbAukQf5d33rEQ": 0.00274457, "1PiimqbMpv6hkWz4yyQkSmBagtmjHvfj5r": 0.00011053,
"1HQsRNrx9X27Q6JCAcnF69mPkUoAXqmdMj": 0.00032794, "1K3pJaWeT1tBNTToXD8cZ9g75a3y8nMVKh": 0.00903162,
"1LcPMyQUVLKN6FvLJkj8CfjdZJrUawRSdp": 0.00017183, "16Qiv3LrTY7nEUdK9BKJ8vwuqvJrGY6ETF": 0.00016217,
"19bLx2PJmfuqyASMUJ2NXLdx9ofBfzaW65": 0.00093776, "1AYiciXpfGGWdE9smuKnzioaeyttsYEAoF": 0.00044199,
"1L7qpeL5KqKVGSUnDKwNBa5CWDDM1oaeSH": 0.00118938, "1HZd4uoLW9gibTVC6FxxPrWbwrUA9UsgWR": 0.0009105,
"14PRSXKxB4ZtKwbvXNsJe1BZq7JTDPWDdN": 0.00106833, "1CzLSNz7V6EufECrjvMXBiVo8Tu7z3XgkF": 0.00053587,
"1FHmBsCFz9uVaQFbnAjSreX7PkkW1N3rvA": 0.00065521, "1BnbYeLnz5Cpry6H7NRwgV5pofkEXokac8": 0.16774504,
"14TwSxSXjnrHDCEqzyqT2aDmZdncNNqsom": 0.00030531, "1NJJgrGhU7rTFCgD9ZbfktKoyj4vFHe4yB": 0.00491943,
"16qujMM7zj5RjDMLn8HTmEQ5CY1XtQ2MJX": 0.00010687, "12TdcXJkxc34kaajvMCbp9w6gfAoYEVntG": 0.00126339,
"1NGtckWb8Ad7BnUgfUpVam14THozPPwaCB": 0.00017094, "1SaturneKrPDwctdME1KoJdT3i5Xhgugt": 0.00014481,
"1papazx2iBNxXwmDtecssrcRt9pYxjTBU": 0.00036556, "1H1jEkQ2N9rCKHMb794nv9KAyKePKNqRZj": 0.00099977,
"1jmMHeQsYknWsWGXzxYLZKynBjywG2SdD": 0.00113199, "1C8JU5LGeQpo3XzVPek5BAdMVs6WUmsUNz": 0.00019229,
"1A18J2obr5ucPoLpDoRB7WykTaDjGFEH2w": 0.00370375, "1JjpZkNS9UvC8mFCxjX2DWFJrprpdjhJqY": 0.00034047,
"16B2MjWLkjYAk2rT38JHvMQyfP6nosYR8t": 0.00010877, "1KnFzhex3TLCDZT1bpdFjUgn35Lkjy1HW7": 0.00033221,
"1FfcZsAKZLivLLT5jAu6Jr3vCmp9aNtoqF": 0.00018819, "17KyYdxmjoGTthA9GXxDcQhu4R82fzUs9Q": 0.00016987,
"1DjNJXATwLMcUqs1UCTBbk6m9awdtqwzWS": 0.00157133, "1BJSrJKJSCCyDXmosTrCNyW3ZkyVFMtvXy": 0.00010515,
"1JdNrVzKu8wbGqLXUPHjszBwLc3zrEuR8s": 0.00255231, "1DANjnr1nmicNCE5SKyh2KrgHogUN9KVHR": 0.00028865,
"1AAMc78N2EJHpanFgUHDdMH93afPFuWGtG": 0.00121901, "14hqYFqU9B8FzkQboDPjKsiyPs1B8Wf1ZB": 0.00094249,
"1CqJejX5bahdHaNUN4x35kPoLNgZ4LPCyC": 0.00068937, "1Jk7W1btqU7yrQ9mhmutuLShigFxinAtrA": 0.00071137,
"12epKX9mCw9PAa5HpqeTANfpw8VRWRZ5BQ": 0.00096301, "198AiWKZS68sQgTRQdwPpyWntbAs2spwiQ": 0.00113682,
"1NdEEsdN8ZuPKowxdjuBqAgtWkch8Cv3FT": 0.00021524, "1727qEJfMinkPjT6ZyXcCGT1jdjcYiXfFT": 0.00027305,
"19GeXf1BRxA6KUtCkaYj4zzPURhBMBPo5Z": 0.0002001, "14cwTw2VWVwGZGG5LNYsE8gQirvHSSibXj": 0.00197266,
"1BXepkA36znUXtaJLpsdbztSB3FXGWmpoL": 0.00010787, "1PoDkbNbXfNAyaeHGmzBcPbyK4ToCV9yEj": 0.00073816,
"19oPixBTg6pmmEge3EpW1mTh71eScAeHv2": 0.00235417, "18xYcXUtTmNoWr2G7itjp5vffy2cg9JDRZ": 0.00083797,
"13HtJCMNFnF674JJv9FC2JZbLM3E3oDYFT": 0.00409341, "1NKASTF3cmt67CkkMTvFAeZ4gLjWUzFfK7": 0.000456,
"1MgWmZUFEm5GiwnNGMip95yTRsNzCrwq7A": 0.00015455, "18oWaYxFygXARdpAChmDKjoMwcR42F3sXY": 0.00015566,
"15SnSdzAxiFUsABuBW2mvpWktG5ywwFDLb": 0.0046096, "1MCd7xLBUoM9hS4qJXogciSrGGxMFrPF6D": 0.0006621,
"16sZ871skVNDDFPoEvec5HoqqKgSYqjQKP": 0.00073139, "1papaEdQiRxySPi373jK46h28LHiHZHgj": 0.00061766,
"1AXWt14VzVpqSnwMCJqAVJ4KrUNBqaAKKv": 0.00052205, "1MmRrtfo5JzSy5ZKgM4oQYosBhiqmHajVc": 0.00019566,
"1GKRR8GhV8wQEkLBKc6gVT8mMK1W9hso1X": 0.06320574, "1Hg66Bo8bMt6r7uACcqEEirJAVRxP7fi7X": 0.00051491,
"1D3T7W256jBbCFyZ7rnnUqiRrR2oJHL8Vp": 0.00026155, "1MFEzWXrQ6EC5Y8p7tyN3uYdQJKPeeW1ZD": 0.00025706,
"1DepAgYBKyVLbeTJs3iW9zrcuts3GYwemB": 0.00096651, "1Km8nYGy814uNasGCwB6X3q82jGE1aFRh9": 0.00050812,
"1M5bshwMYyxG5ftYMzi1AR6kRH4muZ8n6g": 0.00035647, "1papabzG5mfqJGfaXiRFM2PchLnZkwWGt": 0.00062361,
"1Ed3VHtbj6sSN1qXvAQ7o8EgJivUWMcKEF": 0.00067734, "1Zv1SUeraoCJfWSPg2bdgjo417JHgdxEE": 0.00143765,
"12zkF7Y49U1g6QeWwpWCuwojXBQM63GwV9": 0.0060608, "1JWVKjktRku67guGXNWXZr6AewqoQMDu3k": 0.0002512,
"1HCk6GgG7eVphLjhUD2udBqaws2abLeM75": 0.00061593, "19wD8QdfCG5xc9sryH88B4xo26rDt5ik6U": 0.00096635,
"13hTxt88XAu3aTJEKdomjvkYqbnsYR7Ygq": 0.00340178, "1BdTuu7fkAvhnbqCjhhmDcwXGMBWpPZssD": 0.00074471,
"1GMH2nJ7sr9ng88fB1aRNrsBBdaMffEtpB": 0.00061488, "1GFefmJUzw9EjNnQKMgdiXAJfqYf48XUqT": 0.00189097,
"1KBX5hrBz1VyqRf5L2r8BpjZ8UWNdogRj1": 0.00035305, "1771rKYRkhQ2t5X1HQ8a9T8DCMEAfTQcxj": 0.00176134,
"1NastyBTCcxbAQQ22kEijbXqgyBPRujftL": 0.01181784, "1CNuACawJVzBsPhqisvSV1Yt7dPK8kaMuB": 0.0058553,
"19BeQnt4BXNXe2H4vAg3gfCaUUMiZaDrgV": 0.00010707, "13hcExuq3rxcvZkSuZDnLkHYU6zLxQ7MsX": 0.02063859,
"18zTL9s3wX5TGttoPA4Agpf6dEp3zytG5e": 0.00044524, "1LfbhbM7uV1fkSMt9aKAztMpNbhJARNNGx": 0.00092235,
"1MN8MqtrzEk8gnfRhXgjKCChNuhqYsKz5P": 0.00106799, "1DbM7ukCgNCqN7HHkosGjrYkc5aL7mzutZ": 0.01993195,
"1EJsZuDzZTtjDAtinQV3UuBg5QLot3PmYN": 0.00018919, "1HKFFBaYqhm163j2SHRHzgK27tCJWa9AHY": 0.00028018,
"1FG12Ci6a8gJSAaRUnez56wAEQt9p3gsBE": 0.00062288, "19ezRxrsTzXBY6ZV6Bd14Vr6en157V2LKv": 0.00060814,
"1JpiEKt9EyQBtSUDGfLqaE5RrcGj1GNimP": 0.00206969, "1MSL6jsddZWrFFTieJNXAmn686FySJXaA1": 0.00067792,
"1JXvWAQNmW9Fnm56XgHzaWUMXgnHiQKPvQ": 0.00017661, "12MyfbytdEY9zMF9ReGdFpsjfAjxXYyMAY": 0.00084273,
"1819SwKxS4hiM8yeiSbwMdSrWh1qTgAfau": 0.00037145, "1JSUhT47iuRh4JJK8kiZs8wVgH1qaqp4Th": 0.00029107,
"12hpFbHUiz5jtUdgywqz8xhWSCcA6vVzk6": 0.0027863, "15x4opUdJJpVrkHJ4pGzWpVMvrodU3bvXu": 0.0024555,
"1NuMyu9rGG4okLUWzNv6eBMZyYerF2ThqS": 0.00286445, "1LMtpu8pKBL6kQJuUSJkfYPHZNHF3C4PPa": 0.0003701,
"19tVWpc6gJUF3MRysf8D6pFkKXc2BPK1d6": 0.00016895, "13nZ7EZTHk5ZQbPn5jVW3PNamEGcKC9yJ9": 0.0002761,
"1PYmPYcWwYPdzMYR6xTqcrbtGfSQVYUJSN": 0.00129233, "18G6irXaVmScKS3fK1dMi3cMWcVjRLbdo6": 0.00035786,
"1C3wm866B4GQYcjPiiExyaNVXdSG7DjvWv": 0.00034288, "1GUWNoSCnEf1wEiMx4RCGfvmYUYNCRh589": 0.08852005,
"1P2pooLvNQNFDMdNUk96vgZWNC7HfCoZip": 0.0024158, "1C8w4jcZWtiM8ShBAM1VZi4nUNxWSneoWR": 0.00043847,
"1G15vh3swmt1LDvYKE9JCYd7trVbJTwAJa": 0.00044636, "1MASuLjBmABLuatLhQ4PneWzd6EUcxRSH1": 0.00342922,
"1PuEtn8MzGbF58r1KLXwAnyvtq5g2FLZsw": 0.00025533, "1HpDmczvyiUPp1j2Exya4GiftUyFKuBgu": 0.00049881,
"12eMAu5ANDqwTVDaw8VehrGYGX8zKnTwWb": 0.00017235, "1Arq5bpiPNJhrsYKrsUBXUkMafZitU8geQ": 0.00144869,
"13Xs2wtcprLFPKxiSwyGiYqWq1Su1Au3Zv": 0.00095137, "1KzbEWZBty6rBRvFTckuFCxM3cP4gZrcfJ": 0.00385196,
"1DKqAZ2MjJe3LxDpgXWChu31pApgjz6Syr": 0.00602296, "1LpctwkcivACZbFh4yZzQDRcGpGL7f4B6M": 0.00070356,
"1Dyv9krFrLUPfsjzbaCpac8QoTKEWsYQAV": 0.00052912, "15VYFafLDqfVkkqeMWvDc1RsqqpY9tr4cS": 0.0005387,
"14vKL6HyxQHUu9en1ZkXrBPS8hRWbK1GvW": 0.00018286, "12cEUnawxYqobA2tQDXBx2SjfLxC7X5MJq": 0.00067767,
"13fAHCKwgstdWMd4gsPmFkR2oRwHU7X7JZ": 0.00233183, "1LLG6xQgoBzhFnDbJTRe6M4QkXtAGVTqB5": 0.00294158,
"19ckHdvxvLi5eZeoRHAKUcGV3fqSZZuHew": 0.00401621, "1Mag5XQcm81yP3rz8qJvXbU1KSWypKmPur": 0.00043059,
"17igiE1LjcvV4KgtGviHWUfjQJpkShXHLc": 0.00043882, "1NpPc3W4hegDMEch4Fzd4nRTvRUyXbX18a": 0.00164689,
"13HPyx1B67C9KTrbHTEeJBYCVE2bhTnBnE": 0.00119102, "142YScfpKHD8V6Uis7rKJxkzC8hFiQTAaj": 0.00011009,
"1Q3Tp2wCbVaELbtcaiZX5m4bAdafzrieT3": 0.01553395, "1DkHFvYg7QN6phJeLdJ2a5iH9rfUrS8ATv": 0.00077589,
"1Daz4CFGgMBqakiU3CsBRqqn7MKHCE3XXQ": 0.00018799, "1EtyvMPQy8NtdTrJfGvHLQ1zWLkWotzfmF": 0.00103948,
"1HkAkTHGAxLKB8EqCh5H3sgbEpsjmR3sr6": 0.01324461, "1AncuRYTgyLveyq7CseNWkudomPaesjh5L": 0.00035915,
"1FgZMd7jiHpTkRSNzwzG7a7FsZPi1LoP9a": 0.00013869, "1AzcWtoWWoUXigy7JZig5G6vCLKAwTDC8w": 0.01411539,
"1BerndE6ZkNBWPXA6JfM78Xkz84KRCRYk3": 0.00041945, "1EtKHggZDGFVWxusL5UViZkbyJ1n7bMZHx": 0.00114515,
"1JekkWUvbZKwcbSZWEnSxqaxNerGsioZFh": 0.00044815, "1G636XeYB3rVv9XaScpgSwFxoDDRcYKiHJ": 0.00032232,
"1D7apk4pMDZYQftPGrihHwmc5FpeHe8rjj": 0.00078418, "1AAfteM9YeJmUe9zKkdrtaohwx9mJWbCiq": 0.00010642,
"1K5XU2rS4owqFnf8Z6cwr5SwBMBfgt8wku": 0.00976464, "1B2opUJXBTTGZj17z2CxdZFLhU4FABWJ5v": 0.00029479,
"12ifD3JByzpgr5Ha38DfWMty4eYxRLd4kf": 0.00027922, "17HA1G9R4WMbBRQyCqXKiijZi1mhqEj5sD": 0.00175274,
"1CWkjm4CbjhYiaPkjJz9AvKJzfph445Ges": 0.0003575, "1KHVSMwoQPuA4iZBXAjHqZ8gzWEF3My3eG": 0.00176617,
"18a5PBWSQwdSt3Z8AepBpES2czZ9ioTQPd": 0.00598588, "1sRZ5tCsaNB971vyK8dNDcdZc5tJeMgdy": 0.00095792,
"1DZktf2SKTCbdhZg1iNHCT3CiqxXFVErAi": 0.00186907, "1D45Lz2rCK3tFwFRh5okQcKkSNCLXTWgaa": 0.00103685,
"1Too1yh8KZ5fmYhXa822ZrSME8vXmyJmU": 0.00022489, "1KT2ceAEoMnpHMgRCoSW39teFQMBv4vebs": 0.00459533,
"1819GYv9smmrd8m8Mhty82UR52dyzPb9FF": 0.00226015, "1MX63q78Q9GwKnMrDV6aB6xo73Ane4N29S": 0.0002646,
"1DVj9zZdcgHYh47EvR6MnYY5BkYcvgU7JE": 0.00135946, "1Dv8Rs3V1856XU6HtB5B3brxbgac18TBWN": 0.00066098,
"17yZrHFp6aPy6Fx28fgXQwAgC3p14WXW1o": 0.00349491, "19QXV3uA7uYEr41NGzpaJfYzQyUp7qnwaS": 0.00080066,
"1E7ciGtZ3s36aaytgFtkdkfjHYeLrUrrT2": 0.00406361, "1RjkG6qFq7qDTJS8xP3kbyAQ1TLXnj68k": 0.000174,
"15CpU7m11cazNCwR4rcK7zVyyJxp785GQ3": 0.00020892, "1PZMEaqvvteCYtEW2XvMTV7g4b9Now4ArX": 0.00037676,
"1HASS7upxMixmxqgNrjdbv9DrHF1edbMzT": 0.00601828, "1He61cR7zHHjZNMVv1zSv6msvDHipE5bSc": 0.00018703,
"1EbXUVuXZ7zB2SGRTu2dmFB26xjVW63sAD": 0.00053376, "17jb3su7vb5epVzCjwYzeEuFL3w6dvPEts": 0.00036247,
"16StLtJtJ7Foof5HVitHLZMBjA1wH1qKfU": 0.00023672, "1MjkWPD6SFjj1yBT6Mbx7GLT6XvtaAixin": 0.00473185,
"1P2PooLUt1qsnqJgLeGsWN9P4qgie8HwG3": 0.03933728, "1G9gGDVEGHRH4Wfa6kQSZmqAeta8fAw4Yb": 0.0003708,
"13YSQDAepvH77xe51zYgADTb5KfnBcoZkF": 0.00182724, "1NyKGFAK4e33wNYSXcZzjFvRwXEZ9NLBNM": 0.00062746,
"1QAsYoJWZw3JjcwtP6Aac6LiNyjvWhQJSw": 0.00107255, "1BA7n92mEn5d6WDcfHpBLQEf4b6ZPTKSaJ": 0.00218987,
"1MWSaiLQ5Y2EmaV1ydjMmw6MFphKdUMByH": 0.0056201, "1Ksm4J2y4oLWhCJUwZe6DaZzduJtQWLCqV": 0.00018408,
"1MjX3vAXeER3rXZiLWADtobXmDK41VUdQr": 0.00062831, "1P3nbZEC17ARnzf3Gjf4bW4Qwf26grjHxh": 0.00082554,
"1H2xBneyBRAYnHBCQ4MHfYYo9tNJ9B49Rf": 0.01023283, "1MeZ28t6LgKkQA44WAa6YCYda6p4h2LD7Z": 0.00353968,
"12qs7jQ2ZDHKxeZRNVSEpXHd3tdQpJjw9S": 0.00323006, "1GEvGnXQbALcjvw6Ed4Xek6f3wyCgXQRfb": 0.00748651,
"1JxBQVsdvMU2Qf2DKNULA6BdBZtimogPeJ": 0.00019033, "15yc8FCoKGZ5YK5ev4mWfzfZXPPriVmFCr": 0.00123264,
"159jrQVAS8U7FqnCrid7Y51LaPABErd2Us": 0.0025922, "15cCJHQz88neRftSoXfBWG5n9HBmm7CfHW": 0.00543536,
"1AV24hjDkkSSFdcioPEB9ooUPMLi64GMi": 0.0006442, "1Lsz7MzjborCrQwKhhrdGhWpdiRdSxx1mc": 0.00054842,
"1Gisaq9cBpVKrt9cKmtTT56VQkkf8rZPa2": 0.00010199, "14ARAH3LM5dYfdNogdoUMUaQpGscZy9jnh": 0.00045322,
"1GNwJ9jrexJh9AEoi1DHnW1UdAjH52bMTY": 0.00070938, "1MgjQN5soLfGV8VhL1geDLuqWuswUErSy4": 0.00064757,
"1iLtAUHRsVNVg56FzxNdb7vswpiUfhT9f": 0.00178964, "1ATqSasDDvvhrfGUt77UAc1pqUVcg5EVDk": 0.00104268,
"1E8PdS3dHnYBnd982WqQkgSVGd7AZkVHZd": 0.00034973, "1i7abzojtm2Z5z5o1mXkxGhjj6cxQ67eN": 0.00028715,
"1KqEzwR7CLnaRJBx8xYJktfvsE4Ce2vsCF": 0.00017821, "19B5zVCXskc8X8BJwAh5cxKz2PwtDWbt27": 0.00017577,
"1JRnuK94AMbsBoiyh2JPX7vYsF5yhL9i4r": 0.0034497, "1KkuXnDg62HTz5HMbDMJmyr5H2fk8sgQ5m": 0.00403831,
"1UFCfwMrFRNURmGNidHev6jKnexNproR7": 0.0042316, "1HzsPcUVMsewDN3VAmaZensaotbGqmDZs2": 0.00047227,
"1DUr6S1DjwznT2bVST6D3b717NrMxxoqgV": 0.00098516, "19swobnQME15Yrj9wxBiqfDMc9uZ821HAf": 0.0001703,
"1EAjioEi4L8xdYtijmskmxdH2v7wZ62JTS": 0.0010162, "1GRg24ishi4HMC5hoPzKaTmN25yu9bAAax": 0.02163756,
"1JtweHBwbdbVJCBsSoqdjbuCyhCt3XJbuG": 0.00061011, "17MYgn3UVHPCkHPK9VchVRvoHWMPfCXT3B": 0.00805905,
"1MNqjB7pPFUXmt7H7SJWtNrXWNTePsgnjP": 0.00034808, "174h9e9uieKd78QPZZEmw7njzoTR6GgCq9": 0.00041827,
"1NFV1FtArLZxeYC8RMhL1oPjXg2rhGaFHL": 0.00265034, "1C2qwSWPZpT4wtxsytyDrD6GsA6uxThUBp": 0.00402423,
"19ErGvo8Z7eDbJu7fehdz6GcbT438Kixz9": 0.00018352, "1DeVLDoGvkbbB5n3dPvbpDbwiKGjYckCy9": 0.00110977,
"1MH94TYJBMUkRoK1DgvGyCbkSiKiZziiCG": 0.00028878, "1GpTXa6GJSjxcAD8Rro3feqfmrp77xhKoN": 0.00106991,
"1JngVowJ4MUdjSijNH6P6PFHhMRu6cEk62": 0.00091842, "1jWD18EQTc9jmUhsqdSuWPmMyjHqWVWKS": 0.00035231,
"1DPB9mHthfF8y2od7QMPjny1p3DhVdpaxA": 0.00243982, "1CWVMfUgNEnHVcpH8P69yWYgpvqy5ovDnJ": 0.00022229,
"1Jh9HgKuFjRjxHxuUdBcu9Wfq3ZLxN33nt": 0.00249673, "1PB2pp3gF8AYxm3ySP3ugJHk3L1TBaB8Wu": 0.00086399,
"1Erjfqnp7GAh2BzaFJREHvYfGUJCtEvq52": 0.00049159, "15mfrrfX9MXrm4oJMJzHHG1CgXomsTAXU5": 0.00025661,
"1EBTNsjEuw3LJsrT3PJ7G6DmDr3qYgMUra": 0.00019656, "15PMMHKupFQCwRZ1zWFfTMkPNx37T6pdiy": 0.00079046,
"1Gv31PgUAF4Tz3R3W1A2346nxi2RAbmGDM": 0.00173544, "12q4Ysn7RaxMUsa8gzyvPxyCV9bJpiftuQ": 0.01282703,
"15d9u5AFbgpkox98xRnZrtcxBiDJXuBiVP": 0.00025258, "1EjBFDztyfjrSBcAdQWf3aMEDnwNqZCTFF": 0.00103747,
"16NKTzhrMmAxjhAm5mDmcehSq8KxZEcZcF": 0.00319725, "1GWq4krFdEpP8HfFB93ofvFYvw6jhsqAbi": 0.00136674,
"1M9kf75v6Sow6vQiJ1j3R1Hj7wnBAywAUm": 0.00111406, "1EHuVLnzgELUpP1QWuje3xu7mtjLAq3fiH": 0.00266452,
"1MWzXJgsi33EANFsRfnTghasLB1Sn4NGgd": 0.00297038, "1EmsQXNdnYrhrsRPgmSaFnWkQDHmpcWeyp": 0.00375808,
"16so8YrP4PkFX5uEn77cbykhDfQHk6yikX": 0.00177646, "1KZ7ZyRHsFegiRkktH7MtZwk9AztKDhX4i": 0.00076744,
"1KNfu9ufqKR9mhyGa6eXq7Lp2XeCzTJmxo": 0.00045036}

As you can see, it's a considerably longer list.

You would then feed that output to your bitcoin client's "sendmany" command.  Assuming you're on *nix:

Code:
bitcoind sendmany "" "$(wget -O- http://104.131.12.128:9332/patron_sendmany/1/.0001)"

EDIT:

One thing I want to point out: if you just put http://104.131.12.128:9332/patron_sendmany/1/.0001 in your browser, you WILL NOT actually send a donation.  You are just given a list of addresses and how much each address would get.

If you pass that command to bitcoind, you WILL send out a donation.  Don't accidentally put it in your bitcoin client Smiley
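
If you'd rather inspect the split programmatically before sending anything, here's a small sketch (Python 2, not part of p2pool) that fetches the same endpoint and summarizes it; reading the endpoint moves no coins:

Code:
# Preview the donation split from the example node above -- sends nothing.
import json
import urllib2

url = 'http://104.131.12.128:9332/patron_sendmany/1/.0001'
split = json.load(urllib2.urlopen(url))   # {address: BTC amount, ...}
print('%d recipients, %.8f BTC total' % (len(split), sum(split.values())))
# Only the bitcoind sendmany call shown above actually sends the donation.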
legendary
Activity: 1232
Merit: 1000
I have a p2pool payment question. After the blocks were found yesterday I got paid both times, then later in the day I received a couple of very small payments. Not sure what the small payments were for?
Those are donations.  Some generous souls decided to spread the love.  Whenever my node hits a block of NMC, I convert it to BTC and send the entire proceeds out as a donation.  Since I only have about 4 TH/s, it's rare (I've hit exactly two blocks of NMC: one when I found a block of BTC last April and another a month or so ago).  You, too, can donate to p2pool miners if you wish.  Hunterbunter put together a nice web interface for doing it: http://blisterpool.com/p2pdonate.

Thanks for explaining.
donator
Activity: 4760
Merit: 4323
Leading Crypto Sports Betting & Casino Platform
You, too, can donate to p2pool miners if you wish.  Hunterbunter put together a nice web interface for doing it: http://blisterpool.com/p2pdonate.

I was unaware of this.  Thanks for sharing.
legendary
Activity: 1344
Merit: 1024
Mine at Jonny's Pool
I have a p2pool payment question. After the blocks were found yesterday I got paid both times, then later in the day I received a couple of very small payments. Not sure what the small payments were for?
Those are donations.  Some generous souls decided to spread the love.  Whenever my node hits a block of NMC, I convert it to BTC and send the entire proceeds out as a donation.  Since I only have about 4 TH/s, it's rare (I've hit exactly two blocks of NMC: one when I found a block of BTC last April and another a month or so ago).  You, too, can donate to p2pool miners if you wish.  Hunterbunter put together a nice web interface for doing it: http://blisterpool.com/p2pdonate.
legendary
Activity: 1232
Merit: 1000
I have a p2pool payment question. After the blocks were found yesterday I got paid both times, then later in the day I received a couple of very small payments. Not sure what the small payments were for?
member
Activity: 78
Merit: 10
Well, when recoding p2pool, it would be nice to solve the "payload too long" issue as well.

Does somebody know how to deal with this?  Huh  

Please check this post: https://bitcointalksearch.org/topic/m.10164796

 Roll Eyes

So, after doing some digging: the "payload too long" error comes about when a packet's length is greater than max_payload_length.  This can be seen in p2pool/util/p2protocol.py.

It is called from p2pool/bitcoin/p2p.py and p2pool/p2p.py, which set max_payload_length to 1,000,000.

I am going to change the code to print out the packet size, as well as change max_payload_length to twice that size.  I will let the code run for a while, see what happens, and report back.


If I'm correct, the 'maxblocksize' is hardcoded to 1 MB in the Bitcoin protocol, so I guess changing the payload length does the trick within the p2pool code (until the block size is increased). It's still unclear to me what part of the code you changed; logically it has to do with Bitcoin Core, which is responsible for "packaging" the block and verifying its hash. It would be awesome if you could keep updating us. I've also updated my Bitcoin Core and I'm waiting for a long block time with lots of transactions to debug my new setup. So, I'll keep you guys posted! Thanks!



Yes, blocks should not be coming into p2pool at over 1,000,000 bytes, but perhaps they sometimes come in at 1,024,000 bytes (another interpretation of "MB"). Or perhaps p2pool or bitcoind is somehow corrupting them at certain times, which is causing this issue. We'll see; my logging should tell us a little more about what is happening.


I am running two p2pool servers.  One crashed and one didn't (they usually crash at roughly the same time over this payload issue).  The one that survived, however, had the increased max_payload_length.

In the logs it states:

Quote
p2pool.log:2015-01-24 13:55:07.867204 PAYLOAD sharereply 1088513
p2pool.log:2015-01-24 13:55:27.520303 PAYLOAD sharereply 1080454

The number represents the size of the payload, which is over 1,000,000 and would have crashed my node.  So by changing max_payload_length in the two p2p.py files I resolved this issue (for now).

The lines I changed are in p2p.py:
Quote
        p2protocol.Protocol.__init__(self, node.net.PREFIX, 2000000, node.traffic_happened)
and in bitcoin/p2p.py:
Quote
        p2protocol.Protocol.__init__(self, net.P2P_PREFIX, 2000000, ignore_trailing_payload=True)
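
For anyone wondering where that second argument ends up, here is a simplified sketch of the framing code in p2pool/util/p2protocol.py, paraphrased from memory rather than copied verbatim: the constructor stores the limit, and the send path raises the "payload too long" error whenever a packed message exceeds it, which is why bumping the value to 2000000 stops the crash.

Code:
# Simplified sketch of p2pool/util/p2protocol.py (paraphrased, not verbatim).
class TooLong(Exception):
    pass

class Protocol(object):
    def __init__(self, message_prefix, max_payload_length):
        self.message_prefix = message_prefix
        self.max_payload_length = max_payload_length  # 1000000 in stock p2pool

    def sendPacket(self, command, payload):
        # print len(payload)  # the kind of logging added above to find this bug
        if len(payload) > self.max_payload_length:
            raise TooLong('payload too long')         # the error from the logs
        # ... otherwise frame the message (prefix, command, length, checksum) and send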
legendary
Activity: 1540
Merit: 1001
A few things about that ...

Rejecting shares that are worth nothing really should not be of concern Smiley
You have to send an X-million-diff share before it counts as anything at all ... and the rejections you are talking about are WAY below that and won't affect sharechain shares that are actually worth something.

Yup, see my addendum #1 above. Smiley

Quote
Secondly, you'd need to check what the message is that p2pool is sending.
The protocol allows p2pool to tell the miner to discard all work and use the new diff, and the reverse, meaning finish your current work and then move on to the new diff.
If the protocol messages aren't telling the miner to discard work and immediately move on to the new diff, then p2pool rejecting them is a bug.
Again, you lose nothing coz they are worthless, but it shouldn't reject them in this scenario.

It's been a while; I don't recall the exact verbiage.  I do remember the sequence of events, however, and it's exactly as I described above.  p2pool says "difficulty change!" and up to 10 seconds later the Ant submits work at the old difficulty and it gets rejected.  There wasn't a "work restart" request in between, either.  This was tested with my S2 long before Bitmain significantly improved S2 performance on p2pool, so it's probably 5 seconds or so now.  And on the p2pool side, I clearly saw "worker X submitted share with too high difficulty, expected X (received Y)", or something like that.

Quote
If the protocol is telling the miner to discard work, and you have "submit-stale" on (which is on by default and ALL p2pool miners must have it on), then it really doesn't matter at all, since the miner is simply submitting stale shares as is required for p2pool to avoid throwing away valid bitcoin blocks - not having "submit-stale" enabled would be a REALLY bad thing Tongue
Though some of the bitmain firmware does this in their version of the driver - discards valid blocks that p2pool says are stale - so you'd have to check their code for the driver you are using if it's not master cgminer.

I no longer have any Ants in my possession, so I'm not able to look at this again to confirm. Sad

M
legendary
Activity: 4592
Merit: 1851
Linux since 1997 RedHat 4
A few things about that ...

Rejecting shares that are worth nothing really should not be of concern Smiley
You have to send an X-million-diff share before it counts as anything at all ... and the rejections you are talking about are WAY below that and won't affect sharechain shares that are actually worth something.

Secondly, you'd need to check what the message is that p2pool is sending.
The protocol allows p2pool to tell the miner to discard all work and use the new diff, and the reverse, meaning finish your current work and then move on to the new diff.
If the protocol messages aren't telling the miner to discard work and immediately move on to the new diff, then p2pool rejecting them is a bug.
Again, you lose nothing coz they are worthless, but it shouldn't reject them in this scenario.

If the protocol is telling the miner to discard work, and you have "submit-stale" on (which is on by default and ALL p2pool miners must have it on), then it really doesn't matter at all, since the miner is simply submitting stale shares as is required for p2pool to avoid throwing away valid bitcoin blocks - not having "submit-stale" enabled would be a REALLY bad thing Tongue
Though some of the bitmain firmware does this in their version of the driver - discards valid blocks that p2pool says are stale - so you'd have to check their code for the driver you are using if it's not master cgminer.
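
For reference, the two stratum notifications being distinguished here look roughly like this (illustrative field values, not a captured session). mining.set_difficulty on its own only applies to subsequent jobs; it's the clean_jobs flag at the end of mining.notify that tells the miner to throw away in-flight work:

Code:
# Illustrative stratum notifications (all values made up).
set_difficulty = {
    'id': None, 'method': 'mining.set_difficulty',
    'params': [151],                     # new pseudoshare difficulty
}
notify = {
    'id': None, 'method': 'mining.notify',
    'params': ['job_id', 'prevhash', 'coinb1', 'coinb2',
               [],                       # merkle branch
               'version', 'nbits', 'ntime',
               True],                    # clean_jobs=True: discard old work now
}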
legendary
Activity: 1540
Merit: 1001
Two additional points on my prior comment:

1 - Work rejected for being the wrong size obviously doesn't hurt anything, because it's too low-difficulty to be of any value.  Any significantly sized share (like an alt chain share) wouldn't be rejected.  So this only inflates your reject rate for no good reason.

2 - The poor overworked Ants have enough to do already without having to switch work sizes all the time, so fixing the pseudo share size should help in some manner or another, however slight.

M
legendary
Activity: 1540
Merit: 1001
Same here, p2pool does it nicely  Smiley

Yes, it does do it nicely.  However, if you have overburdened, overworked miners, as all Antminers are, there is a reason to set the pseudo share size.

I've pointed my former Ants through my homegrown proxy to see what causes rejects and why.  One cause is beyond the control of p2pool, although it can be alleviated by setting the queue size to 1 or 0.  The other, however, you can solve by fixing the pseudo share size that p2pool feeds your Ants.

Here's what happens:

p2pool: work size is now 121
Ant: got it, I'll start using 121 as soon as I can
Ant: here's some work from the prior work size of 105
p2pool: rejected!  difficulty is too high (it really says this!  in reality the submitted difficulty is too low, but to keep it simple for people, p2pool reverses it and says it's too high)
Ant: alright, here's some work of size 121
p2pool: accepted!
Ant: and more
p2pool: accepted!
p2pool: whoa, hold on Ant, you're feeding me work too fast.  work size is now 151!
Ant: got it, I'll start using 151 as soon as I can
Ant: here's some work from the prior work size of 121
p2pool: rejected!  difficulty is too high!

And so forth.

Fix your pseudo share size, and no more rejects because of work size that is "too high".

M
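
If memory serves, p2pool's README documents a miner-side way to pin the pseudoshare size: append +<difficulty> to the payout address in the worker name (a /<difficulty> suffix sets a minimum share difficulty instead). An illustrative cgminer invocation with a placeholder node and address, fixing the pseudoshare difficulty at 512:

Code:
cgminer -o stratum+tcp://yournode:9332 -u 1YourPayoutAddress+512 -p x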
hero member
Activity: 686
Merit: 500
WANTED: Active dev to fix & re-write p2pool in C
Same here, p2pool does it nicely  Smiley