
Topic: p2pool - Advancement of Decentralized Mining - Vital to Bitcoin Network Security - page 2. (Read 19474 times)

full member
Activity: 135
Merit: 107
How much do you estimate is needed to implement these changes?
sr. member
Activity: 662
Merit: 250
Push it, push it real good.
hero member
Activity: 770
Merit: 500
Bitcoin and Litecoin need this now, more than ever.
legendary
Activity: 1148
Merit: 1014
In Satoshi I Trust
great goal! this will help BTC and LTC!
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
A recent post on reddit describes frustration with p2pool:

"I tried to switch to p2pool, but it's simply not working for me. The computer I use to control my ASIC's is an old one that can not run bitcoind and p2pool at the same time. I tried, but it's just too slow. p2pool always warns about not being able to access bitcoind and CPU is constantly at 100% whenever bitcoind receives new transactions.
Next I tried to mine using other p2pool nodes that I chose from here: http://p2pool-nodes.info/ I tried to select nodes that are close, low latency, and have no fee. Unfortunately about 10% of my work seems to get rejected, not sure why. So that did not work either.
Any ideas what I could do? I love the concept of p2pool, but currently the centralized mining pools are just much more usable."

http://www.reddit.com/r/Bitcoin/comments/1usb72/p2pool_i_really_tried_but_its_not_working_for_me/
legendary
Activity: 1456
Merit: 1000
newbie
Activity: 56
Merit: 0
I'm bumping this thread; this chart is looking ugly.

http://i40.tinypic.com/x5eed3.jpg

GHash.io is showing 33% on the 4-day chart and 36% on the 24-hour chart.
Their cex.io scheme is getting them a lot of hashrate.
hero member
Activity: 544
Merit: 500
Litecoin is right coin
I wish people would donate so we can finally get some proper updates to P2Pool. By not supporting P2Pool mining, people are only helping the operators of the standard pools.

You can set up your own P2Pool node (even in Windows) and mine without a fee. It's idiotic to keep dumping your coins into the pockets of the operators of standard pools.

Of course, this is ignoring the inherent security of mining on P2Pool, where you can mine on any node with zero risk of your coins being stolen (since you are mining directly into your own wallet).
legendary
Activity: 1232
Merit: 1094
Adding a trustless accumulator helps with dust payouts.  It doesn't affect the minimum share difficulty.

Even without a bootstrap, it could be tweaked to reward larger miners.  This acts as a direct incentive for miners to push up their share difficulty during the transition.

Miners with lower hashing power don't lose out directly, since what they are owed is held in the block chain. They do have a lower expected payout, though.

Lowering the minimum share difficulty requires increasing the share rate.  However, ASICs have difficulty updating their work quickly.

One way would be to update the share chain in groups of shares.

Share Group Header:

int: version
hash: prev (points to the previous share group header)
hash[32]: points to 32 valid shares
long: timestamp
int: difficulty
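
As a minimal sketch in Python, assuming 32-byte hashes and taking the field types loosely (this is illustrative, not real p2pool code):

Code:
from dataclasses import dataclass
from typing import List

@dataclass
class ShareGroupHeader:
    version: int         # int: format version
    prev: bytes          # hash: hash of the previous share group header
    shares: List[bytes]  # hash[32]: hashes of the 32 valid shares
    timestamp: int       # long: when the group was assembled
    difficulty: int      # int: difficulty each share must meet

    def is_well_formed(self) -> bool:
        # a group must commit to exactly 32 share hashes of 32 bytes each
        return len(self.shares) == 32 and all(len(h) == 32 for h in self.shares)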

Shares would be valid if they paid out based on the previous share group.

Nodes would broadcast their shares.  Once a node has 32 shares that build on the previous share group, then it would construct a new header.

The most likely outcome is that the miner which hits the 32nd share would broadcast a sharegroup containing his share and the 31 others.

A miner who finds a share has an incentive to broadcast it so that it is included in any sharegroup.
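
Roughly, the assembly rule could look like this (using the ShareGroupHeader above; the share fields prev_group, seen_at, and hash are hypothetical):

Code:
def try_build_group(prev_hash, pending_shares, difficulty, now):
    # collect broadcast shares that build on the previous share group
    candidates = [s for s in pending_shares if s.prev_group == prev_hash]
    if len(candidates) < 32:
        return None  # not enough shares yet; keep listening
    # take the 32 shares seen first; in practice the miner who finds the
    # 32nd share would broadcast the group of his share plus the other 31
    chosen = sorted(candidates, key=lambda s: s.seen_at)[:32]
    return ShareGroupHeader(
        version=1,
        prev=prev_hash,
        shares=[s.hash for s in chosen],
        timestamp=now,
        difficulty=difficulty,
    )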

If the share rate was 1 second, the share group rate would be 32 seconds.  ASICs would only need to update once per share-group rather than once per share.  

So, you get a 32X drop in the minimum share difficulty, but ASICs still only have to update once every 32 seconds.

This would create more dust, due to the higher share rate.  So, it would need to be combined with an accumulator of some kind.

[Edit]
If shares that pointed to any of the previous 3-4 share groups were considered valid, then the orphan rate would be even lower. ASICs could run for 2-3 minutes on the same base.
[/edit]
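
That relaxed rule might be sketched as (the depth constant and field names are again hypothetical):

Code:
VALID_DEPTH = 4  # accept shares building on any of the last 4 share groups

def builds_on_recent_group(share, group_hashes):
    # group_hashes: share group header hashes, oldest first
    return share.prev_group in group_hashes[-VALID_DEPTH:]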
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
Anyone concerned about the growth of silent pools (GHash.io, Discus Fish, etc.) should support this initiative.
sr. member
Activity: 263
Merit: 250
SUPPORT DECENTRALIZED MINING TECH

https://bitcointalksearch.org/topic/p2pool-advancement-of-decentralized-mining-vital-to-bitcoin-network-security-329860
If the community donates in excess of $1,000 to Forrest's donation addresses before noon UTC on December 31st, the Litecoin Dev Team will contribute an additional $1,000 in support of research and development of p2pool.

https://blockchain.info/address/1KxvX5Hx8nh36ig2gT5bpeEcqLQcwJsZGB
0.114995 BTC
http://ltc.block-explorer.com/address/LPfkfi2tMuGSc64PZTsP9Amt367hwAUQzY
204.00221964 LTC

The donation addresses must increase in value by > $1,000 above the current received totals.

Why are we doing this?
Litecoin Dev already donated $2,600 to Forrest earlier this year.  We strongly believe that p2pool improvement is critical to the future of Bitcoin so we want to encourage others to join us in supporting this cause.
sr. member
Activity: 263
Merit: 250
Glad to see someone clearly understands the issues involved. =)
legendary
Activity: 3430
Merit: 3080
Just read the Brainstorming document. Some great plans.


Fee on each share is a smart idea; I didn't realise the payout per share is value/difficulty weighted as things stand. I now see the wisdom in larger miners upping their share diff threshold.

Trustless Accumulator (both variants) would be vital; infrequent, dusty payouts to those with relatively little hashing power are a real barrier.

Multi-threaded share propagation is potentially good, and could work very well with the per-peer Statistical Tracking (as mentioned). Even though it does little to alter the payouts over time, it would improve the user perception of "less wasted work" in the system as a whole. I'm not sure, though, whether it won't just increase the effective granularity of stales, as there will still be the same number of shares in the chain.

Per-peer Statistical Tracking in its own right is great for encouraging the kind of miner who uses professional hosting, and so contributes to a perception of professionalism. It gives those miners a perceived high-worth status, even if only in terms of the stratification of connections. And of course orphans could be less prevalent.
legendary
Activity: 1078
Merit: 1006
100 satoshis -> ISO code
Fantastic initiative Warren. Donation on the way...
legendary
Activity: 1232
Merit: 1094
A nice feature would be a distributed memory pool.

There could be a field in the links on the share chain for validated transactions.  The entry for each transaction would include:

- the transaction
- the path to the merkle root for that transaction
- the input transactions

Providing all the input transactions is expensive.  With two input transactions of 250 bytes each, you need to provide 750 bytes worth of transactions.  If the merkle tree is 12 levels deep, that is an additional 32 * 12 = 384 bytes.  In total, that is around 4.5 times the size of the transaction alone.
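
The arithmetic, using the illustrative sizes above:

Code:
TX_SIZE = 250      # bytes, assumed size of a typical transaction
NUM_INPUTS = 2     # input transactions that must also be provided
MERKLE_DEPTH = 12  # levels in the block's merkle tree
HASH_SIZE = 32     # bytes per merkle branch hash

tx_bytes = TX_SIZE * (1 + NUM_INPUTS)    # 750: the tx plus its two inputs
branch_bytes = HASH_SIZE * MERKLE_DEPTH  # 384: the merkle path
total = tx_bytes + branch_bytes          # 1134 bytes in total

print(total / TX_SIZE)  # ~4.5x the size of the bare transaction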

With large multi-input transactions, it would be bigger.  If the transactions per share chain link were limited in size, then naturally those transactions would be discouraged.

Double spending could be protected against by having a system where you can claim a share by showing a transaction included in the share was double spent. 

Nodes can't add the transaction to the share chain's memory pool without proving the transaction exists (and providing the inputs so that all verification information is provided).

The owner of the share which added a transaction might get the tx fees (or a percentage of them).  This would encourage nodes to add transactions.  If an illegal transaction is added, then that share goes to the address which submitted the notification of the error.
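
Such a claim could be checked roughly as follows: two distinct transactions spending the same outpoint prove the double spend (all names here are hypothetical):

Code:
def proves_double_spend(tx_a, tx_b):
    # a double spend is two distinct transactions sharing a spent outpoint
    if tx_a.txid == tx_b.txid:
        return False
    outpoints_a = {(i.prev_txid, i.prev_index) for i in tx_a.inputs}
    return any((i.prev_txid, i.prev_index) in outpoints_a for i in tx_b.inputs)

def process_claim(share, tx_in_share, conflicting_tx, claimant_address):
    # redirect the share's payout to whoever reported the illegal transaction
    if proves_double_spend(tx_in_share, conflicting_tx):
        share.payout_address = claimant_address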

If it became popular, then even SPV wallets might store all that info for coins held in the wallet, and transmit it when trying to spend the coin.

Combined with something like the "Ultimate Blockchain Compression", maybe even double-spend checking could be done locally.

There could be a network rule for how to pick transactions from the pool.  This would mean that all nodes mine against the same block.

Transactions that have been added would not be included for at least a few shares.  For example, with 10-second shares where transactions in the 12 most recent shares are not used, there would be two minutes for illegal transactions to be detected.
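
Under those example numbers, the maturity rule might be sketched as:

Code:
SHARE_INTERVAL = 10  # seconds per share (example figure above)
MATURITY = 12        # skip transactions from the 12 newest shares

def usable_transactions(share_chain, mem_pool):
    # 12 shares * 10 s = 2 minutes for illegal transactions to be caught
    recent = set()
    for share in share_chain[-MATURITY:]:
        recent.update(share.tx_hashes)
    return [tx for tx in mem_pool if tx.hash not in recent]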
newbie
Activity: 56
Merit: 0
Just sent 20 mBTC.
With a decentralized system we are our own bank, and I believe anyone involved has a responsibility to keep the network safe. I'm doing my part with the donation; I hope it will help your effort.
legendary
Activity: 3430
Merit: 3080
I'm sure one of the reasons centralised pools are more popular is simply that the profit motive pushes them to a higher level of professionalism, and miners respond to that.

Otherwise known as the "I like the interface" position.


The main issue to me is the hardware a competitive p2pool node requires. Increased usage would be easier to encourage if something could be done to slim down p2pool's resource overheads, but it's difficult to see how the disk space and disk access requirements could be improved (at least with the present sharechain design). The memory and CPU requirements could be improved within the current design, but the obvious solution would sacrifice the ease of platform portability currently enjoyed by using the Python runtime. And obviously a C or C++ rewrite would be a massive job, particularly if it were to embrace a range of platforms, which is a problem Python already solves.

I guess there is some respite in the form of improvements to bitcoind's memory usage with the -disablewallet configuration option (coming soon, and available now in a testing branch), but I can't help wondering whether time and the progress it brings might be most decisive. Did Pieter Wuille's custom ECDSA re-implementation ever get merged? That sort of thing will matter more when the upper limit on the block size inevitably changes, in whatever way is eventually decided.

I'm thinking along the lines of being able to easily adapt low-cost computing devices into p2pool nodes. A change to the block size may force the disk space and performance requirements out of the feasibility zone, but every other requirement is unlikely to become so unwieldy. In just a few years, the typical low-cost computing device in the Arduino/RasPi mould may be more than capable of everything that good operation of a p2pool/bitcoind setup requires (taking into account the balance of increasing transaction frequency, downward revision of the processing requirements per transaction, and the rising capabilities of the latest ARM designs). At present, though, some form of high-performance desktop machine just has to be used as a p2pool node; there is no real alternative if you want to make the most of your hashrate.

The reason for the low-cost device angle is obvious: all new mining devices either include or rely on a computing device with a networking controller (ethernet or WiFi) and at least enough performance to run unoptimised builds of the mining software (until the developers can get a device to test and code with). The aim would be to reach a stage where unoptimised miner code/drivers run with as much comfort margin as possible, so that manufacturers begin to choose over-specified devices as the more prudent option (which could even help drive down the unit cost of these sorts of low-cost computing devices in itself). Once the mining code for a given device is optimised, the newly vacated headroom could be leveraged for running p2pool. Can it be done now, with the current version of Python and its memory management, on our current generation of mining ASICs? No is the answer. Can native p2pool (and bitcoind) builds be practicably produced for the processor architecture of every possible low-cost computer used as a mining controller? I expect no is the answer to that question too.

But there must be some opportunity to leverage the processing controllers that inevitably form a part of nearly every typical miner that rolls out of the manufacturers' doors. Maybe then someone might be (more necessarily) motivated to work on a shiny-tastic web interface  Cheesy
legendary
Activity: 1526
Merit: 1134
Stopping existing miners from leaving sounds like the most important thing for sure, but after that is stabilised I think basic stuff like a proper website and more professional documentation/help/installers could go a long way.

I'm sure one of the reasons centralised pools are more popular is simply that the profit motive pushes them to a higher level of professionalism, and miners respond to that.
staff
Activity: 4284
Merit: 8808
Wonder if it makes sense to prioritize the transactions of volunteer miners as an incentive.
It wouldn't be hard for p2pool to at least prioritize its users spending their generated coins, but P2Pool has never had a large enough share of the global hashrate to make it seem worth doing.
hero member
Activity: 784
Merit: 1000
Wonder if it makes sense to prioritize the transactions of volunteer miners as an incentive.