
Topic: [1500 TH] p2pool: Decentralized, DoS-resistant, Hop-Proof pool - page 414. (Read 2591964 times)

legendary
Activity: 2968
Merit: 1198
sr. member
Activity: 434
Merit: 250
Switch to Electrum. Smiley
sr. member
Activity: 295
Merit: 250
Is there any way to disable the wallet function in P2Pool and instead have it forward the percentage of shares the node is to receive to a different wallet?

I have hit a stumbling block; over a week ago I initiated a transfer of over half a bitcoin and Bitcoin-QT told me I needed to account for 0.0004 for the fee, so I did that, and ever since the transfer has been in limbo. It looks like bitcoin-qt gave me the wrong fee, if I understand the blockchain correctly.

https://blockchain.info/tx-index/7bbbc7b5fac8c2973e9c62a17cead946b9cf1832aac47479799ce566fafabcc1

Also, I need to figure out if I can fix this issue, or have I lost those coins? Sad

Thanks,



The correct fee for that is 0.0004; the fee on the link provided is 0.00004.

The chances of getting into a block for a transaction less than 10k (or 27k if it's high priority) in size are slim.

This goes back to my initial problem with bitcoin-qt/bitcoind when it comes to sending transactions. I am trying to transfer 0.31779405 bitcoins and I am getting this dialog from bitcoin-qt; it's not hard to imagine that with my 0.59xxx bitcoins the number was 0.00004, not 0.0004 ...



I'm reasonably sure there's a bug somewhere in bitcoin-qt that causes transaction fees to be off by a factor of ten. Basically it adds an extra zero after the decimal point. I've had three transactions hit with this issue, where the fee that bitcoin-qt asks for is exactly 1/10th what the rest of the network is expecting. This causes my transactions to take many hours (in one case, over a day) until they get confirmed.  That said, this is probably off-topic for this thread.
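For context on the numbers above, the fee rule most nodes enforced in that era was a flat per-kilobyte minimum, rounded up to the next kB. A minimal sketch of that rule (the 0.0001 BTC/kB rate, the rounding behavior, and the function name are assumptions about the period's defaults, not something stated in this thread):

```python
from math import ceil

def expected_fee_btc(tx_size_bytes, fee_per_kb=0.0001):
    """Estimate the fee relaying nodes expect: a flat rate per
    started kilobyte. fee_per_kb=0.0001 BTC is an assumed default,
    not taken from this thread."""
    return ceil(tx_size_bytes / 1000.0) * fee_per_kb
```

A transaction stitched together from many tiny p2pool payouts can easily run to several kB, which is how a 0.00004 fee can fall an order of magnitude short of the 0.0004 the network expects.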
hero member
Activity: 630
Merit: 501
Is there any way to disable the wallet function in P2Pool and instead have it forward the percentage of shares the node is to receive to a different wallet?

I have hit a stumbling block; over a week ago I initiated a transfer of over half a bitcoin and Bitcoin-QT told me I needed to account for 0.0004 for the fee, so I did that, and ever since the transfer has been in limbo. It looks like bitcoin-qt gave me the wrong fee, if I understand the blockchain correctly.

https://blockchain.info/tx-index/7bbbc7b5fac8c2973e9c62a17cead946b9cf1832aac47479799ce566fafabcc1

Also, I need to figure out if I can fix this issue, or have I lost those coins? Sad

Thanks,



The correct fee for that is 0.0004; the fee on the link provided is 0.00004.

The chances of getting into a block for a transaction less than 10k (or 27k if it's high priority) in size are slim.

This goes back to my initial problem with bitcoin-qt/bitcoind when it comes to sending transactions. I am trying to transfer 0.31779405 bitcoins and I am getting this dialog from bitcoin-qt; it's not hard to imagine that with my 0.59xxx bitcoins the number was 0.00004, not 0.0004 ...


legendary
Activity: 2968
Merit: 1198
p2pool may be a poor choice for small miners anyway, because the payouts they receive will be in many tiny chunks, which will end up costing them more to spend than a single payout from a normal pool.

This is a good point. Maybe it should be possible to store deferred payouts in the share chain up to some minimum and apply them to a future block, the way eligius does. There is definitely value in coinbase payouts and the pool not having a wallet (though in practice eligius does have a wallet) while at the same time not paying out a tiny slice of every single block to every single miner.

I don't know if this can be made hop proof though.
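The deferred-payout idea above could be sketched roughly like this (the `settle_payouts` helper, the 0.01 BTC minimum, and the dict-based bookkeeping are all illustrative assumptions, not p2pool or eligius code):

```python
def settle_payouts(earned, minimum=0.01, deferred=None):
    """Hold each miner's earnings until they reach `minimum`, then pay.
    earned:   {address: BTC earned this block}
    deferred: {address: BTC carried over from earlier blocks}
    Returns (payouts_this_block, new_deferred). Illustrative only."""
    deferred = dict(deferred or {})
    payouts = {}
    for addr, amount in earned.items():
        total = deferred.pop(addr, 0.0) + amount
        if total >= minimum:
            payouts[addr] = total    # large enough: pay out in this block
        else:
            deferred[addr] = total   # too small: carry forward in the chain
    return payouts, deferred
```

The open question the post raises still applies: carrying these balances in the share chain would need to be done in a way miners can verify, or it reintroduces the trust a normal pool requires.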
legendary
Activity: 2968
Merit: 1198
A person that has one bitcoin that loses one bitcoin proportionally loses much more than a person that has a thousand bitcoins and also loses one bitcoin. In absolute terms, however, they've lost the same amount. I doubt either of them will be thinking of the loss in absolute terms.

Yes, I agree. Risk relative to assets does matter.

But there is no direct relationship between your hash rate and your assets, so risk relative to hash rate does not necessarily matter. Where I already said I agree is that someone who is poor (you said poor country but the wealth of the individual matters more) should be more concerned about smaller risks.

In other words, if you want to manage risk in "relative" terms, the correct denominator to use is assets, not hash rate. What this means in practice for hobby miners is they need not be overly concerned about variance.
donator
Activity: 2058
Merit: 1007
Poor impulse control.
Yes, we've covered that before, but there's no point in considering "absolute terms". Few people consider anything related to BTC in "absolute terms". In fact I can't think of many situations in which there are useful BTC related charts which are not log-linear.

Bitcoins in absolute terms are what you spend (or save). In fact the number you are quoting, standard deviation divided by mean, has no units at all. It's a number that doesn't exist in the real world, only on graphs.


Now you're just being wilfully silly.

A person that has one bitcoin that loses one bitcoin proportionally loses much more than a person that has a thousand bitcoins and also loses one bitcoin. In absolute terms, however, they've lost the same amount. I doubt either of them will be thinking of the loss in absolute terms.
sr. member
Activity: 476
Merit: 250
In case you are, or in case there are other readers as confused as I, you can think about it like this: the number of p2Pool shares a miner submits per time period is an approximately Poisson distributed random number. This means that the standard deviation of this number is proportional to the square root of the mean. As the share submission rate increases, the standard deviation only increases as the square root of the mean, so a miner with a greater share submission rate has, proportionally, a much lower standard deviation.
The key word you are using there, which contributes much to the confusion, is proportionally. In fact the standard deviation, measured in units of dollars, BTC, or anything else, is higher for the larger miner. It grows more slowly than the mean, but it still grows. A small miner has a small standard deviation in absolute terms.

A standard deviation in absolute terms makes no sense conceptually.
Small miners suffer from much higher variance on p2pool than large miners.
You don't care because they aren't earning much anyway; they do.
p2pool may be a poor choice for small miners anyway, because the payouts they receive will be in many tiny chunks, which will end up costing them more to spend than a single payout from a normal pool.
legendary
Activity: 2968
Merit: 1198
Yes, we've covered that before, but there's no point in considering "absolute terms". Few people consider anything related to BTC in "absolute terms". In fact I can't think of many situations in which there are useful BTC related charts which are not log-linear.

Bitcoins in absolute terms are what you spend (or save). In fact the number you are quoting, standard deviation divided by mean, has no units at all. It's a number that doesn't exist in the real world, only on graphs.
donator
Activity: 2058
Merit: 1007
Poor impulse control.
In case you are, or in case there are other readers as confused as I, you can think about it like this: the number of p2Pool shares a miner submits per time period is an approximately Poisson distributed random number. This means that the standard deviation of this number is proportional to the square root of the mean. As the share submission rate increases, the standard deviation only increases as the square root of the mean, so a miner with a greater share submission rate has, proportionally, a much lower standard deviation.

The key word you are using there, which contributes much to the confusion, is proportionally. In fact the standard deviation, measured in units of dollars, BTC, or anything else, is higher for the larger miner. It grows more slowly than the mean, but it still grows. A small miner has a small standard deviation in absolute terms.

Yes, we've covered that before, but there's no point in considering "absolute terms". Few people consider anything related to BTC in "absolute terms". In fact I can't think of many situations in which there are useful BTC related charts which are not log-linear.

legendary
Activity: 2968
Merit: 1198
In case you are, or in case there are other readers as confused as I, you can think about it like this: the number of p2Pool shares a miner submits per time period is an approximately Poisson distributed random number. This means that the standard deviation of this number is proportional to the square root of the mean. As the share submission rate increases, the standard deviation only increases as the square root of the mean, so a miner with a greater share submission rate has, proportionally, a much lower standard deviation.

The key word you are using there, which contributes much to the confusion, is proportionally. In fact the standard deviation, measured in units of dollars, BTC, or anything else, is higher for the larger miner. It grows more slowly than the mean, but it still grows. A small miner has a small standard deviation in absolute terms.

sr. member
Activity: 434
Merit: 250
Yes that is the question. I'm not sure you can flat out dismiss "technical" concerns. Maybe things like graphs and so forth are important. There may be other feature issues. For example, I know a few people have mentioned automatic splitting of payouts for farms with investors and group buys as being an attractive feature of ghash.io. I'm personally using the fee function in p2pool for that, but that only gives a two-way split, and it means I can't really open up my node for public use, which I might otherwise do. (There are no public nodes in my geographic area as far as I can tell.) Marketing and awareness are also important of course.

Honestly until you mentioned it this evening I'd never even seen this sort of feature mentioned before. I did some searching and here's a post talking about exactly what you said, and it is why they use ghash.io:

https://bitcointalksearch.org/topic/m.4409010

The only issue is ghash.io is holding your coins, and they pay out in precise %s. In p2pool's world you can't split a share like that, so the best you can do is approximate it by allocating work to mining addresses in the %s you want to pay out to. Which won't be as exact.

Edit: You know, it'd be possible to code this up where it's all passed in the username. Does a pool username have a max length in programs like cgminer/bfgminer? Since + and / are already used, one could add support in p2pool for a syntax like ADDR1*.5,ADDR2*.3,ADDR3*.2 as your username, splitting the payout address assigned to your miner(s)' work randomly among those addresses at those %s.
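A sketch of what parsing such a username might look like (the `ADDR1*.5,...` syntax is the proposal above; the function name and the requirement that the weights sum to 1 are my assumptions, and real addresses would also need validation):

```python
def parse_split_username(username):
    """Parse a proposed 'ADDR1*.5,ADDR2*.3,ADDR3*.2' username into
    a list of (address, weight) pairs. Hypothetical syntax sketch,
    not actual p2pool behavior."""
    splits = []
    for part in username.split(','):
        addr, sep, weight = part.partition('*')
        if not sep:
            raise ValueError('missing * weight in %r' % part)
        splits.append((addr, float(weight)))
    total = sum(w for _, w in splits)
    if abs(total - 1.0) > 1e-9:
        raise ValueError('split weights must sum to 1, got %r' % total)
    return splits
```

The node would then pick one of these addresses at random, weighted by the given fractions, when handing out work, so over many shares the payouts approximate the requested split.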
sr. member
Activity: 434
Merit: 250
No not me. I'm only splitting two ways so the fee function works for that. I did consider at one point that if I needed to split more ways I would find the fee function (random split) in the code and modify it, but I haven't needed it yet.

Ah ok. In addition to the struggle to think of things that would entice miners to use my nodes for some appropriate fee %, there's also the feeling that anything I do I should just commit to the project and make open source as well. Which, of course, then ends up giving me nothing but a sense of accomplishment and no way to pay for my servers. Smiley
donator
Activity: 2058
Merit: 1007
Poor impulse control.
To be clear: I am not arguing that p2Pool should be made more minable by smaller miners. I am explaining to you how CDFs vary significantly for different hashrates. Someone with a large hashrate might see a +/- 1% variation in earnings per week. Someone with a small hashrate might see earnings change to 0.01% or 1000% of the previous week's earnings. I think you're not understanding how significant that is.

It's significant only insofar as people make it out to be significant. If a hobby miner's earnings vary from $10 to $0, that might well be less significant than for a larger-scale miner whose earnings drop from $5000 to $4000 but who spent many thousands of dollars on mining gear and has an electricity bill of hundreds or thousands of dollars per month. A $10 shortfall is a $10 shortfall, and can't ever make more than a $10 difference. You can't change that by putting some scary percentage number on it.

In any case, as I suggested earlier, the focus on very small miners is misguided. What p2pool needs is more large miners. Adding a few thousand 5 GH miners won't make any real difference. Getting KnC or whoever that was to mine their 1000 TH on p2pool instead of eligius changes the whole picture.

I'm not following, bud. Are you going off on a tangent here? Wink Or maybe what you address in this post isn't really my concern? I'm not saying it's bad, or wrong, just not my concern.

Either way I'm fine with that, but I want to be certain you weren't still discussing the difference between large and small miner variance.

In case you are, or in case there are other readers as confused as I, you can think about it like this: the number of p2Pool shares a miner submits per time period is an approximately Poisson distributed random number. This means that the standard deviation of this number is proportional to the square root of the mean. As the share submission rate increases, the standard deviation only increases as the square root of the mean, so a miner with a greater share submission rate has, proportionally, a much lower standard deviation.
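To put numbers on the square-root relationship: for a Poisson-distributed share count, the standard deviation relative to the mean is 1/sqrt(mean). A quick sketch (the example share rates are made up for illustration):

```python
import math

def relative_stddev(mean_shares):
    """For a Poisson count with the given mean, sd = sqrt(mean),
    so the sd as a fraction of the mean is 1/sqrt(mean)."""
    return math.sqrt(mean_shares) / mean_shares

# A hobby miner finding ~4 shares/week has an sd of 50% of expected
# earnings; a large miner finding ~400 shares/week has an sd of only 5%,
# even though the large miner's sd is bigger in absolute BTC terms.
```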
legendary
Activity: 2968
Merit: 1198
Yes, it is added variance. This is accepted and understood by the participants in my case. In practice the payouts seem to be fairly close (it is shares that are being split probabilistically, and over the payment window the number of shares is fairly large). As you said, it reduces the chances of wallets being compromised and other problems.

Were you the one I chatted with about it on #p2pool? It's been long enough I don't recall who it was now. Smiley

No not me. I'm only splitting two ways so the fee function works for that. I did consider at one point that if I needed to split more ways I would find the fee function (random split) in the code and modify it, but I haven't needed it yet.
sr. member
Activity: 434
Merit: 250
Yes, it is added variance. This is accepted and understood by the participants in my case. In practice the payouts seem to be fairly close (it is shares that are being split probabilistically, and over the payment window the number of shares is fairly large). As you said, it reduces the chances of wallets being compromised and other problems.

Were you the one I chatted with about it on #p2pool? It's been long enough I don't recall who it was now. Smiley
legendary
Activity: 2968
Merit: 1198
Yes that is the question. I'm not sure you can flat out dismiss "technical" concerns. Maybe things like graphs and so forth are important. There may be other feature issues. For example, I know a few people have mentioned automatic splitting of payouts for farms with investors and group buys as being an attractive feature of ghash.io. I'm personally using the fee function in p2pool for that, but that only gives a two-way split. Marketing and awareness are also important of course.

I chatted with someone on #p2pool once who did a probabilistic payout model. That's more random for the people you are paying than if you collect the coins yourself and then pay them evenly. But of course that does open yourself up to wallet theft, etc. I'm a big fan of paying directly to people and not holding coins whenever possible.

Yes, it is added variance. This is accepted and understood by the participants in my case. In practice the payouts seem to be fairly close (it is shares that are being split probabilistically, and over the payment window the number of shares is fairly large). As you said, it reduces the chances of wallets being compromised and other problems.

I don't really agree that "you could code it yourself" is a good answer to a missing function that could attract more miners. I obviously don't know how many miners this particular function would attract.


sr. member
Activity: 434
Merit: 250
Yes that is the question. I'm not sure you can flat out dismiss "technical" concerns. Maybe things like graphs and so forth are important. There may be other feature issues. For example, I know a few people have mentioned automatic splitting of payouts for farms with investors and group buys as being an attractive feature of ghash.io. I'm personally using the fee function in p2pool for that, but that only gives a two-way split. Marketing and awareness are also important of course.

I chatted with someone on #p2pool once who did a probabilistic payout model for a group of people who mined with him on his private node. That's more random for the people you are paying than if you collect the coins yourself and then pay them evenly. But of course that does open yourself up to wallet theft, etc. I'm a big fan of paying directly to people and not holding coins whenever possible.

So for example, if you had 10 investors who own 10% each of the pool, then you wouldn't use the incoming miner addresses at all. You'd customize the payout code so each of those addresses had a 10% chance of being chosen as the one going into the p2pool share chain (like the way fees work). The problem is if investors get upset that investor A is being paid more than B, since it's random. A more advanced approach could record the payments actually paid out to the investors on the blockchain to their group-buy addresses, and mine to the investor address that has received the least payments. Sort of a payment round-robin approach, so one investor can't get too far ahead of another due to randomness. (Adjusted as needed, of course, if investors don't have equal ownership stakes.)

So someone in your example might find that useful. But it'd be easy enough to code without needing it as an official function for everyone.
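The round-robin idea above could be sketched roughly like this (the function, the stake/paid bookkeeping, and the "largest deficit" selection rule are my illustrative assumptions, not existing p2pool code):

```python
def pick_payout_address(stakes, paid_so_far):
    """Choose the investor address furthest behind its ownership stake.
    stakes:      {address: ownership fraction, summing to 1}
    paid_so_far: {address: BTC already paid out on-chain}
    Illustrative sketch of the 'payment round robin' idea only."""
    total_paid = sum(paid_so_far.values())
    def deficit(addr):
        # negative deficit = underpaid relative to this address's stake
        return paid_so_far.get(addr, 0.0) - stakes[addr] * total_paid
    return min(stakes, key=deficit)
```

Each new block's coinbase would then be mined to whichever address this picks, so randomness in which shares find blocks can't let one investor pull far ahead of another.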

Edit: I'm always trying to think of ways to run p2pool public nodes profitably. The proxy pool project is pretty cool and I ported that to vertcoin but didn't make a payment backend system yet. However your comment about farms/group buys wanting a feature to break out their payments in arbitrary ways more complex than just 'everything to this address' could be a value added feature worth implementing.
legendary
Activity: 2968
Merit: 1198
p2pool already works well with large miners. Cointerra's products work wonderfully with it. It isn't a technical challenge, it's a marketing/advertising/perception one.

Yes that is the question. I'm not sure you can flat out dismiss "technical" concerns. Maybe things like graphs and so forth are important. There may be other feature issues. For example, I know a few people have mentioned automatic splitting of payouts for farms with investors and group buys as being an attractive feature of ghash.io. I'm personally using the fee function in p2pool for that, but that only gives a two-way split, and it means I can't really open up my node for public use, which I might otherwise do. (There are no public nodes in my geographic area as far as I can tell.) Marketing and awareness are also important of course.

sr. member
Activity: 434
Merit: 250
In any case, as I suggested earlier, the focus on very small miners is misguided. What p2pool needs is more large miners. Adding a few thousand 5 GH miners won't make any real difference. Getting KnC or whoever that was to mine their 1000 TH on p2pool instead of eligius changes the whole picture.

p2pool already works well with large miners. Cointerra's products work wonderfully with it. It isn't a technical challenge, it's a marketing/advertising/perception one. Why don't the bulk of CT users mine with p2pool instead of some other pool? That's the question. If I owned a huge farm, using p2pool would be an automatic choice.