
Topic: Getting rid of pools: Proof of Collaborative Work

legendary
Activity: 1456
Merit: 1174
Always remember the cause!
@anunymint
Miners who begin the finalization process need to accumulate enough shares. Once a share is found with a difficulty much higher than what is needed, they can choose to eliminate a few smaller shares (typically ones that don't belong to themselves) and include the newly minted one, keeping the sum at the 95% threshold needed. This will distribute the 'bad beat' consequences among more miners.
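Roughly, in code (my own sketch; the greedy selection and all the names are illustrative, only the threshold idea comes from the paragraph above):

Code:
# Hypothetical sketch: when a high-difficulty share arrives, re-select the
# share set so the summed score still meets the threshold, preferring to
# keep one's own shares and the largest foreign ones.

THRESHOLD = 0.95  # cumulative score required before finalization

def reselect_shares(shares, new_share, own_addresses, threshold=THRESHOLD):
    """shares: list of (address, score) pairs already collected.
    Returns a subset, including new_share, whose scores still sum to
    at least `threshold`, dropping the smallest foreign shares first."""
    candidates = shares + [new_share]
    # Own shares first, then higher scores: the tail of this ordering is
    # exactly the small foreign shares we are willing to eliminate.
    candidates.sort(key=lambda s: (s[0] in own_addresses, s[1]), reverse=True)
    selected, total = [], 0.0
    for share in candidates:
        if total >= threshold:
            break  # threshold met: the remaining (small, foreign) shares drop
        selected.append(share)
        total += share[1]
    return selected if total >= threshold else candidates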

Thanks a lot for your reviews; you are truly helpful and I appreciate it. I maintain that PoCW, this proposal, is solid, though.

By the way, your introduction of my proposal as a design that
Quote
changes the winning block solution from a single Schelling point to multi-inflection point game theory
is formulated in a somewhat sneaky way.

Distributing what you call "a single Schelling point" is nothing less than eliminating it!

As for the proximity-premium flaw of traditional PoW: the flaw is caused by the very fact that a single miner happens to find a block, for which everyone else is still killing themselves, while he has already started mining the next block (after relaying his discovery). It puts him (and his peers in the next rank) in a premium position which he can leverage to perform better in the next phase, and so on.

In PoCW, we have tens of thousands of hot zones (new-share-found events) distributed all over the network. One can hardly categorize being at the focal point of such a zone (the lucky miner), or being close to it, as a premium, simply because a new share is not big news at all; it happens frequently and is evenly distributed across the network.

I think it is very important to remain relativistic (as you always like to mention): PoCW is an improvement; it improves PoW relatively, by tens of thousands of times.
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
Just edited the starting post to improve the protocol: it now guarantees a minimum reward for miners of Prepared Blocks in case transaction fees are not enough, and at the same time encourages them to commit more transactions to the Net Merkle Tree (probably the ones with higher fees) by dedicating 30% of the respective fees to the finder; this is traded against 1% of the block reward.

For the time being this is not done as a complete rewrite of the article; just a comment has been added at the end.
jr. member
Activity: 56
Merit: 3
ONNI COIN! The New Paradigm!
@aliashraf @anunymint

Congratulations, you have made progress despite the high entropy of this thread. Very glad!

The implementation and testing phase, even just to an MVP, will shine a light on any unresolved issues, and hopefully they will be addressed.


@anunymint How are you getting on with your health? (I read your post about the complications.) Wishing you to get better and better soon.
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
@anunymint

I'm not going to patronize you, but you have been helpful and productive here. I'll mention your contribution in the white paper when discussing possible objections, including the ones you have made so far and the way Proof of Collaborative Work is supposed to resist them.


legendary
Activity: 1456
Merit: 1174
Always remember the cause!
{....}
We will see how breaking the prison will escalate more improvements later. Especially, I'm very optimistic about on-chain scaling via the sharding infrastructure utilizations inherent in PoCW.

You already lost that debate up-thread. You cannot increase your factor to 1 million or 10 million to accommodate unbounded scalability.

Besides, I still do not think PoCW even works at a 10,000 factor.

And proof-of-work will centralize and there’s nothing you can do to ameliorate that outcome.
Unbounded scalability is not needed at all, and the latest calculations by @tromp suggest practical scalability for PoCW.

By reducing the block time to 1 minute and applying a 0.0001 minimum-threshold rule for shares, we reach the 10^-5 scale-down needed to keep a single S9 able to participate in solo mining bitcoin, producing an average of 2 shares per hour.

As of the latest calculations, we need an average number of shares on the order of scale/ln(scale), which is promising for scaling difficulty down to 10^-7 with just 72,000 shares per round (1,200 shares/second) on average.

A 10^-7 scale-down is more than sufficient for a 1,000-fold increase in network difficulty (how many decades later?), because we expect the smallest-miner-to-protect to be at least 10 times better, and 1,200 shares/second is not that high for 2050, I suppose.
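For a quick numeric check of these figures (my own sketch; it just evaluates the scale/ln(scale) estimate discussed with @tromp below, with the 1-minute block time assumed above):

Code:
import math

BLOCK_TIME_SECONDS = 60  # 1-minute blocks, as proposed above

def expected_shares_per_round(share_scale):
    # Approximate expected number of shares whose scores sum to 1,
    # ignoring variance (see the exchange with @tromp below).
    return share_scale / math.log(share_scale)

# share_scale is the ratio of target difficulty to minimum share difficulty;
# e.g. 1e6 covers the 10^-7 network scale-down when the smallest miner to
# protect is 10x better, as argued above.
for exponent in (4, 5, 6):
    n = expected_shares_per_round(10 ** exponent)
    print(f"share scale 1e{exponent}: ~{n:,.0f} shares/round, "
          f"~{n / BLOCK_TIME_SECONDS:,.0f} shares/second")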

So, unlike what you claim, the scalability debate is not lost for PoCW.
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
@anunymint

Bitcoin is not just about security; it is far more about decentralized issuance of money and transacting with it. Money is not wealth, and it is not capital; it is just a device. Its fungibility is essentially what makes it such a device.

A new monetary system, of any kind, will have to face scalability and flexibility issues, and it is absolutely possible for it to suffer from shortcomings in this regard in its earliest stages of development. Improvement is the solution.

Betraying PoW and Satoshi is your right (and Buterin's, and that of PoS enthusiasts too).

But my choice is different: I remain loyal and try to fix issues. This proposal is about fixing two important flaws in PoW, and it will scale bitcoin at least 10 times while keeping it fully objective and decentralized.

We will see how breaking the prison will escalate more improvements later. Especially, I'm very optimistic about on-chain scaling via the sharding infrastructure utilizations inherent in PoCW.

PoCW is just evidence showing the feasibility of improving instead of giving up and sticking with subjective, failed alternatives. If they were not just failed ideas, how was it possible at all for bitcoin to rise?

Anyway, I'll go through your links and will discuss them with you, probably in separate threads, as I promised. For this topic, I think we can forget about strategic and visionary issues and take it just as it is: a case study of an improvement proposal for PoW.
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
P.S. If you understood anything about the Old South, you would understand that it is the way you talk to other people that makes you a bona fide citizen of that culture. I do not talk to people on this forum the way I talk to people in my culture, because very few here have the etiquette of the Old South. So on this forum I get to be as big of an asshole as others are to me. It is pure defect-defect because the forum is anonymous. Although I have made a decision to try to exhibit much greater patience and tolerance.
No worries dude, I will do my best to keep this debate alive as long as some meat is served here  Wink
About the Old South ... you already know, I'm a fan!

I introduced PoCW not to give birth to a new coin; improving bitcoin and Ethereum (while saving the latter from Buterin's Casper coup d'état) are my main concerns. As one of the first contributors to this topic has correctly emphasised, the main challenge here is political. I started this topic to spread the word and the idea, not to convince people to join a scammy shitcoin project, but to find out who is who and how the weather is. The next step is implementing the code and demonstrating how feasibly, smartly and cleanly I can do this.

I have a long war to win and I don't play this game so cautiously, saying nothing unless you've got an army of academicians behind you. That is not the way I fight: when I find the answer, I show up with it and start fighting, and I will fight to the river, as I have told you elsewhere. I don't hesitate and I don't postpone everything for paperwork.


Proof-of-work is “might is right.” Why are you idealistic about fungible stored capital, which enslaves mankind? You are just fiddling with the fringe of the issue. The central issue is that proof-of-work is all about the same ole paradigm throughout human history that says fungible stored claims on someone else’s labor are powerful.

I reject that and we are headed into a paradigm shift which will render the NWO reserve currency Bitcoin irrelevant. Security of fungible stored capital is not the big deal it was in the fixed-investment-capital Agricultural and Industrial ages. You need to understand that you are barking up an old, rotten tree with old, rotting Old World money in its midst.

Read every linked document in this blog, and my comments below the blog and wake up:

https://steemit.com/cryptocurrency/@anonymint/bitcoin-rises-because-land-is-becoming-worthless
It is the true story behind this debate, isn't it?
At a very unfortunate moment of your history with bitcoin and PoW, you made a horrible decision: giving up on both!
Cultures won't help; people are just the same, no matter when or where they belong. They just give up when they become disappointed.

For me, this is a different story. When I find something brilliant, I don't care about its current state of development; brilliance is enough for me to commit and not give up on it, no matter what.

History teaches us another important lesson too: when a paradigm shows up, it stays for a while, and it is pointless and mostly impossible to have a paradigm shift every decade.
I will check your link and I'll go through your replies as you wish, I promise, but I have to say I'm strategically against any attempt to replace PoW; it seems to me just a fake, ridiculous attempt, a cartoon. Sorry, but it was you who chose the wrong side.

If by formalism you mean a lot of mathematical analysis to address every single possible attack or vulnerability, I think it is too much at this stage.

I think it’s impossible not to have acrimony without it. You just assume epsilon without actually proving how small the effects you dismiss are.

{....}

You need to show the math and prove it is epsilon.

And that also includes your presumption of “for a few seconds per minute.” That depends on the variance of the miner.
I don't agree. It is always possible to discuss issues without going through formal and mathematical analysis. I did some formal analysis of this specific subject of your concern (variance in the transition period) and have shown how sharp this period is.
But now you are unsatisfied and keep pushing for more details, which I just can't schedule more time for; and if I did, nobody would read it, not now.

Quote
There is also a countervailing force which you could argue for, which (I mentioned up-thread) is the investment in shares the miner already has and the amount of luck he adds to the block being solved if he does not stop mining. But that is probably a negligible factor and Vitalik already explained that altruism-prime is an undersupplied public good (see the weak subjectivity Ethereum blog), so I doubt miners would be able to agree to not defect for the greater good. It’s a Prisoner’s dilemma.

Again you need to show the math.
Please! You probably know my opinion about this kid, Buterin, and his foolish "weak subjectivity" thing. It is a shame: a boy desperately obsessed with being a genius is trying to revolutionize cryptocurrency with 'weak' shits. Absolutely not interested.

As for the proposed 'additive' for miners not to back off because of the hypothetical variance in the transition phase: thanks for reminding me, and I'm fully aware of it; I just didn't bring it forward, to avoid complicating the subject even more.

Anyway, it improves the odds and can't be rejected by the boy's "discovery" of altruism not being the dominant factor in a monetary system  Cheesy
It is not about the well-being of others.
Miners always have an incentive for their own previously mined shares (in the current round) to be part of the chosen 93%, and their late shares, besides earning direct rewards, will help this process.

Quote
Actually perhaps the most logical action is for smaller miners to switch to centralized pools for a portion of the block. Probably it will be difficult to argue against that mathematically. So if true, that probably more or less defeats the stated purpose of PoCW.
I think there is a possibility (not a force) for some kind of pooling in PoCW. But it won't be the same as conventional centralized pools, not even a bit (it doesn't need to be), and it won't defeat the purpose, which is eliminating the pooling pressure and its centralization consequences.

I have to analyze it far more, but I guess a light gradient exists here in favor of forming kinds of 'agreements' between clusters of small miners to communicate in star topologies and help each other transition more smoothly. It is a light gradient, as there are very low stakes (2% or so) on the table.

One should again take into consideration the way PoCW fixes the proximity premium and practically synchronizes miners to transition between phases at almost the same time, as I have already discussed extensively, implying short transition periods and less incentive to pay the setup/cleanup costs needed to join a pool temporarily.
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
Of course it ameliorates your objection. How could it possibly do anything else after all?

Craig Wright is an obnoxious, stubborn, boastful person who fell flat on his face when he finally tried to formalize and publish a whitepaper.

I’ll await your formalization, as I waited several years for Radix to finally specify their flawed design.

Although formalism is not my first priority right now (implementation is), I have presented this proposal in a formal way. The algorithm and the terminology have been described in detail and free of ambiguity. You might have noticed that until now I've made only one major edit as a result of discussion with @ir.hn. If by formalism you mean a lot of mathematical analysis to address every single possible attack or vulnerability, I think it is too much at this stage.

I'm not suggesting, generally, that a more formal and finalized whitepaper be postponed, but for this particular project implementation is the higher priority. Let me explain:

I introduced PoCW not to give birth to a new coin; improving bitcoin and Ethereum (while saving the latter from Buterin's Casper coup d'état) are my main concerns. As one of the first contributors to this topic has correctly emphasised, the main challenge here is political. I started this topic to spread the word and the idea, not to convince people to join a scammy shitcoin project, but to find out who is who and how the weather is. The next step is implementing the code and demonstrating how feasibly, smartly and cleanly I can do this.

I have a long war to win and I don't play this game so cautiously, saying nothing unless you've got an army of academicians behind you. That is not the way I fight: when I find the answer, I show up with it and start fighting, and I will fight to the river, as I have told you elsewhere. I don't hesitate and I don't postpone everything for paperwork.

Distributing a risk tens of thousands of times across the network and neutralizing the proximity premium (which PoCW definitely achieves) is not a simple point to be overlooked easily. When you participate in a network with a relatively good distribution of information and luck, you just don't back off because you are afraid of not hitting every single round. It is just part of the game. The only concern is always how distributed and fair this game is.

Correct. You’re also describing Nakamoto proof-of-work, in which small miners must join pools, because the variance risk is always too high for the entire block period.

Analogously, I claim that PoCW creates increasing variance risk later in the block period. So again smaller miners need to turn off their mining the closer they get to the end of the block period, and wait to mine the next block again.
Except that Nakamoto's PoW suffers from being vulnerable to mining variance over the whole period, while mine is in danger only during a short transition period. Plus, in Nakamoto's case we have a single focal point of information and a proximity premium, while in my proposal, PoCW, we compensate for the said danger by distributing the information (new shares) tens of thousands of times.

You totally ignore both differences and I don't know why.
You suspect that when the total score of the shares a miner is aware of gets close enough to the 0.93 threshold, a rational miner may stop mining, take a break, wait for the further shares that will probably come, and switch to the next phase, because there is a greater chance of his newly mined shares not being included in the finalized block, which would be a waste of ... wait ... a waste of what? Electricity? Because the rents are already paid, aren't they?

Electricity is not already paid. If there is any form of flat-rate mining hardware hosting account which does not meter electricity on a usage basis, then the account is not profitable to mine with, because electricity is the major cost of mining.


What I was trying to say is that mining involves several cost factors: rent, hardware depreciation, wages and electricity. The hypothetical back-off strategy can only help reduce electricity costs for a few seconds per minute by relieving the miner from hashing. I suggest that even with high electricity fees, this won't trade off against dropping the chances to hit and be rewarded.
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
AFAICT, my objection is not ameliorated by any randomization or distribution.

Of course it ameliorates your objection. How could it possibly do anything else after all?

Distributing a risk tens of thousands of times across the network and neutralizing the proximity premium (which PoCW definitely achieves) is not a simple point to be overlooked easily. When you participate in a network with a relatively good distribution of information and luck, you just don't back off because you are afraid of not hitting every single round. It is just part of the game. The only concern is always how distributed and fair this game is.

Quote
I’m sorry I am not going to repeat the economics again. AFAICT you are simply not understanding.


On the contrary, I do understand every bit of your objection, and more ...
It is by no means about "economics"; you are simply questioning whether the incentives are enough compared to the risks. It is not complicated enough to be called economics.
Accusing me of being ignorant about such a simple trade-off between costs and rewards ... well, it is too much in this context.

Quote
The miner will simply observe that not mining within a window nearer to 0.93 will be more profitable. If they have ambiguity around the window, they’ll simply have to back off to an even lower threshold or use a timeout.
Now let's make a slightly more "economical" assessment here:

You suspect that when the total score of the shares a miner is aware of gets close enough to the 0.93 threshold, a rational miner may stop mining, take a break, wait for the further shares that will probably come, and switch to the next phase, because there is a greater chance of his newly mined shares not being included in the finalized block, which would be a waste of ... wait ... a waste of what? Electricity? Because the rents are already paid, aren't they?

So it is about risking electricity expenses over a 2-3 second duration against a proportionally fair chance to hit and be (almost) the first to transition to the Finalization phase.

Quote
But I have nothing to gain by continuing to explain it. Cheers.

Note that doesn’t mean I am 100% certain. I would need to do some formalization. And I am unwilling to expend the effort.

Neither am I. It is too luxurious for this protocol to be analyzed to that extent; I'll leave it as it is. For now, I'm convinced that no back-off threat practically exists. Miners will simply take their shots, because the crisis threshold is very narrow, as I have proved before.



legendary
Activity: 1456
Merit: 1174
Always remember the cause!
You wrote nothing to refute my technical objection. Appears you don’t even comprehend the economics I presented.

No, I addressed it with:
Quote
This is why I'm convinced that the network phase transition occurs synchronously, in a very short window of time.

This analysis confirms your concern about the risk of the latest shares never being finalized, but it shows that this applies only to a very short duration and that the chances are distributed more evenly across the network.

To make it clear and to be more precise:

In two windows of time, the early seconds of contribution and the last seconds (both being very short, as I have argued above for the latter and shown in my analysis of the convergence process for the former), there is a chance for shares not to be finalized.

The proximity-premium resistance of the algorithm will compensate for this risk by distributing it evenly across the network.

Note: I'm not sure yet, but I suppose the latter point, the risk being distributed, has an interesting implication: in the long term, it is no risk at all; it is part of the protocol and is automatically adjusted for by the target difficulty.
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
I received a cordial PM from @aliashraf.
Smiley I feel good about you despite the bitterness, and this topic is about collaborative work after all. I PM'd just to keep the topic focused.
I am not trying to tell others to not participate. I am not trying to railroad your thread. I am not trying to belittle or insult you (although I got angry that you were violating my culture above and were, like, forcing me into an acrimonious discussion about something I am not that interested in). None of that.
Good start. I maintain my strategy of getting you to commit more, though.

As the shares accumulate and their summed difficulty approaches 0.93 of the target difficulty, the variance of profit skyrockets exponentially. IOW, the ROI for continuing to mine shares declines exponentially approaching the finalization of the block.

I couldn't figure out how approaching the required cumulative share difficulty could result in the situation you suggest.

When a miner is able to generate a Shared Coinbase Transaction (i.e. he is aware of enough valid shares with the same Net Merkle root and a cumulative score of at least 0.93), it is time for him to immediately switch to the Finalization phase, i.e. mining a block with at least 2% of the network's calculated target difficulty.

The block our miner is trying to mine in this phase points to a Shared Coinbase Transaction that has the miner's address in its very first row and rewards it exactly 2%, so there is a fair reward proportional to the required difficulty, which is equivalently 2%. The lucky miner who manages to find such a block collects that reward and finalizes the round.
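A minimal sketch of this switching rule (the 0.93 threshold and the 2% finalization difficulty are from the protocol as described here; the data shapes and names are mine, purely for illustration):

Code:
from dataclasses import dataclass

CONTRIBUTION_THRESHOLD = 0.93   # cumulative score that ends Contribution
FINALIZATION_DIFFICULTY = 0.02  # fraction of the network target difficulty

@dataclass
class Share:
    merkle_root: str  # Net Merkle root the share commits to
    score: float      # difficulty score of the share

def next_phase(known_shares, net_merkle_root):
    """Keep contributing until the shares known for this Net Merkle root
    reach the threshold, then switch to mining the Finalization block."""
    score = sum(s.score for s in known_shares
                if s.merkle_root == net_merkle_root)
    # At >= 0.93 the miner builds the Shared Coinbase Transaction and
    # mines the Finalization block at 2% of the target difficulty.
    return "finalization" if score >= CONTRIBUTION_THRESHOLD else "contribution"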

When mining shares early in the Shared Coinbase Transaction phase, the variance of the smaller miner is less of a cost, because his share will still likely be produced before the window of time expires. But approaching the 0.93 level of accumulated shares, it becomes more costly from a variance-weighted analysis for the smaller miner to risk starting the search for a share solution. The math is that the miner will be less likely to earn a profit on average over the short term of a few blocks when mining closer to the expiration time. Over the long term, I think the variance skew may average out, but the problem is that time is not free. There’s a cost to receiving profits delayed. This is why smaller miners have to join a pool in the first place.
In the Contribution phase, miners won't interrupt mining; they continue brute-forcing the search space unless they realise that one of the following two events has happened:
1- Another Net Merkle root (Prepared Block) is getting hot and they are in danger of drawing dead. This is supposed to happen at the very beginning of the Contribution phase, and they respond by switching to the new trend.

2- The 93% limit is reached. This happens as a sharp, near-simultaneous event across the network, and miners react by switching to the Finalization phase simultaneously (within a very short interval, say 2-3 seconds, I guess).

Wait! Do not rush to the keyboard to mock me or to teach me what I don't really need to be taught; just continue reading ...


I understand: minds poisoned by traditional PoW, for which propagation delay is a BIG concern, cannot imagine how this is possible, but it is exactly the case with PoCW:

Suppose we are approaching the 0.93 limit (say 0.736, 0.841, 0.913, ...) from the viewpoint of a miner A. Shares are arriving, and the sum is getting closer and closer to the limit ...

What would be the situation for miner B (say, at the far end of the longest shortest path from A)?

B can feasibly be experiencing (0.664, 0.793, 0.879, ...) at the same time!
 
Before proceeding further, let's understand how this situation is feasible at all ...

Looking closer at the shares that A has validated and is accumulating to calculate the series 0.736, 0.841, 0.913, ..., and the ones B is using to generate 0.664, 0.793, 0.879, ..., may reveal an interesting fact: they are not exactly the same, and especially when it comes to newer shares they diverge meaningfully!

Suppose the collection of all the shares (regardless of their miners and how far they have been propagated) is

S = {s_1, s_2, s_3, ... , s_(n-5), s_(n-4), s_(n-3), s_(n-2), s_(n-1), s_n}

Obviously S_A and S_B are subsets of S representing how much of S each of the miners, A and B, is aware of (by being informed about, or being the source of, each share).

As there is propagation delay and other reasons for any miner to 'miss' a member of S, it is completely possible to have

S_A = {s_1, s_3, s_4, ... , s_(n-5), s_(n-4), s_(n-2), s_n} *
* Missing s_2, s_(n-3), s_(n-1)

S_B = {s_1, s_2, s_4, ... , s_(n-5), s_(n-3), s_(n-1)} *
* Missing s_3, s_(n-4), s_(n-2), s_n

They have most of the shares in common, but they don't have access to all the same shares; they don't need to. Miner B may suddenly receive more shares from adjacent peers and find himself closer to the 0.93 limit, and so on ...

This is how PoCW mitigates the troubles we usually deal with because of the network propagation problem: we distribute information almost evenly across the network and reduce the weight and importance of the proximity premium.

This is why I'm convinced that the network phase transition occurs synchronously, in a very short window of time.

This analysis confirms your concern about the risk of the latest shares never being finalized, but it shows that this applies only to a very short duration and that the chances are distributed more evenly across the network.
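Here is a toy simulation of this claim (entirely my own sketch: the 1,200 shares/second figure is from my earlier post, and the propagation delays are made-up numbers). Two nodes see overlapping but unequal subsets of S, yet they cross the 0.93 limit within a small fraction of the round:

Code:
import random

random.seed(1)
THRESHOLD, SCALE, SHARE_RATE = 0.93, 10_000, 1200.0

# One round of shares: (creation_time, score), score = 1/u, u ~ U[1, SCALE].
shares, t, total = [], 0.0, 0.0
while total < 1.2:                      # a little more than one full round
    t += random.expovariate(SHARE_RATE)
    s = 1.0 / random.uniform(1.0, SCALE)
    shares.append((t, s))
    total += s

def crossing_time(max_delay_s):
    """When does a node whose links add up to `max_delay_s` of random
    propagation delay see its known subset of S reach the threshold?"""
    arrivals = sorted((t + random.uniform(0.0, max_delay_s), s)
                      for t, s in shares)
    acc = 0.0
    for arrival, score in arrivals:
        acc += score
        if acc >= THRESHOLD:
            return arrival
    return float("inf")

a = crossing_time(0.05)   # miner A: well-connected, <=50 ms delays
b = crossing_time(0.40)   # miner B: far end of the network, <=400 ms
print(f"A crosses at {a:.3f}s, B at {b:.3f}s, gap = {abs(a - b) * 1000:.0f} ms")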







legendary
Activity: 1456
Merit: 1174
Always remember the cause!
@tromp
Thanks for the explanation.
I checked a few references; 1086 scores for a 10,000-fold scale-down is not exactly true, as we have variance > 0 in the distribution of scores, as you correctly mentioned. But we should also notice that for a large number of blocks we have another convex function for n (the number of shares), distributed randomly with an order of magnitude less variance (my intuition), and its expected value is what we are looking for. It is beyond my expertise to go much further and discuss this problem thoroughly, though.
It would be of much help if you could spend some time on this and share the results. Both for a more active discussion and to keep this topic reserved for more general discussion, I have started another topic regarding this problem.

By the way, for practical purposes I suppose we can confidently use our previous estimate for n (1400), as the least optimistic one, for now.
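In the meantime, here is a quick Monte Carlo sketch of the quantity in question (my own code; it samples scores exactly as @tromp defines them below and counts shares until the sum exceeds 1):

Code:
import math, random

def shares_needed(scale):
    """One trial: a share's hash is uniform in [T, T*scale], so its score
    T/hash is 1/u for u uniform in [1, scale]; count shares until the
    summed score exceeds 1."""
    total, n = 0.0, 0
    while total < 1.0:
        total += 1.0 / random.uniform(1.0, scale)
        n += 1
    return n

scale, trials = 10_000, 5_000
mean_n = sum(shares_needed(scale) for _ in range(trials)) / trials
# Variance makes E(N) exceed the zero-variance figure scale/ln(scale).
print(f"empirical E(N) ~ {mean_n:.0f}  vs  scale/ln(scale) = "
      f"{scale / math.log(scale):.0f}")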
legendary
Activity: 988
Merit: 1108
As I understand, the expected value is also known as the expectation, mathematical expectation, EV, average, mean value, mean, or first moment (the way Wikipedia defines it)


Correct.

Quote
and my primitive observation says that once you have the expected value/average of a finite number of uniformly distributed random values and their total sum, you can obtain the cardinality by dividing the sum by that expected value/average, provided the variance is zero or very low, which is true for a pseudo-random function like SHA-2.

In your case you have a (potentially unbounded) sequence of i.i.d. random variables S_i
(the score of the i'th share) and a separate random variable N, depending on all the S_i, which is the minimum n for which the sum of the first n of the S_i exceeds 1.
Of course if the S_i have 0 variance then N = ceiling(1/S_i).

A closely related case is where there is a single random variable S and N is just 1/S.
In that case Jensen's inequality [1] applies and you have E(N) >= 1/E(S) , with equality only for Var(S)=0.

I'm not sure to what extent Jensen's inequality carries over to your case.
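For a tiny worked illustration of that inequality (my own numbers, just for illustration): let S be 1/2 or 3/2 with equal probability. Then E(S) = 1, so 1/E(S) = 1, while E(1/S) = (2 + 2/3)/2 = 4/3 > 1. The larger the variance of S, the larger the gap, which suggests scale/ln(scale) understates the expected number of shares, to an extent the stopping-time setting would have to confirm.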

[1] https://en.wikipedia.org/wiki/Jensen%27s_inequality
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
if the expected score is ln(scale)/scale then the total number of shares is n = scale/ln(scale) by definition.
Hence, n < 1086 for a 10,000-fold scaling down of the difficulty. Right?

That's only approximately right. You do have that the expected sum of 1086 scores exceeds 1,
since the expectation of a sum is the sum of the expectations, but asking for the expected number of shares needed to exceed 1 is something else.
As I understand, the expected value is also known as the expectation, mathematical expectation, EV, average, mean value, mean, or first moment (the way Wikipedia defines it), and my primitive observation says that once you have the expected value/average of a finite number of uniformly distributed random values and their total sum, you can obtain the cardinality by dividing the sum by that expected value/average, provided the variance is zero or very low, which is true for a pseudo-random function like SHA-2.

What am I missing?
legendary
Activity: 988
Merit: 1108
if the expected score is ln(scale)/scale then the total number of shares is n = scale/ln(scale) by definition.
Hence, n < 1086 for a 10,000-fold scaling down of the difficulty. Right?

That's only approximately right. You do have that the expected sum of 1086 scores exceeds 1,
since the expectation of a sum is the sum of the expectations, but asking for the expected number of shares needed to exceed 1 is something else.
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
Would you please do a complete rewrite of your proposed formula ... for clarification? Substituting ln(1/mindiff) for ln(n) just makes no sense to me, or I'm missing something here.

Let T be the target threshold determined by the difficulty adjustment,
and scale be some suitably big number like 10^4.

Let shares be hashes that fall into the interval [T, T*scale], and define their score as T / hash.
When accumulating shares until their sum score exceeds 1, one is interested in the expected score of a share.

This can be seen to equal 1/scale times the expected value of 1/x for a uniformly random real x in the interval [1/scale,1]. Considering the area under a share score, the latter satisfies (1-1/scale) E(1/x) = integral of 1/x dx from 1/scale to 1 = ln 1 - ln(1/scale) = ln(scale).

So the expected score is approximately ln(scale)/scale.

if the expected score is ln(scale)/scale then the total number of shares is n = scale/ln(scale) by definition.
Hence, n < 1086 for a 10,000-fold scaling down of the difficulty. Right?

If it were just a school exam I wouldn't hesitate that much, because to my knowledge, and to the extent I checked it against available references, it seems to be basic:
score = T/hash (checked)
probability of x = 1/x (checked)
expected value of x = integral of 1/x dx over the range [1/scale, 1] = ln(1) - ln(1/scale) = ln(scale) (checked)

Yet I'm not entitled to weigh in on it, and the result (a 10,000-fold scale-down achieved with 1,086 shares) is too good. I just didn't expect that much efficiency.

Any more comments?
Correct reasoning, I mean, the expected value of a variable x as defined over its random distribution.
legendary
Activity: 988
Merit: 1108
Would you please do a complete rewrite of your proposed formula ... for clarification? Substituting ln(1/mindiff) for ln(n) just makes no sense to me, or I'm missing something here.

Let T be the target threshold determined by the difficulty adjustment,
and scale be some suitably big number like 10^4.

Let shares be hashes that fall into the interval [T, T*scale], and define their score as T / hash.
When accumulating shares until their sum score exceeds 1, one is interested in the expected score of a share.

This can be seen to equal 1/scale times the expected value of 1/x for a uniformly random real x in the interval [1/scale,1]. Considering the area under a share score, the latter satisfies (1-1/scale) E(1/x) = integral of 1/x dx from 1/scale to 1 = ln 1 - ln(1/scale) = ln(scale).

So the expected score is approximately ln(scale)/scale.
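As a quick numeric check of that expectation (my own sketch; it just samples uniform hashes and averages the resulting scores):

Code:
import math, random

scale, samples = 10_000, 1_000_000
# hash uniform in [T, T*scale] means score = T/hash = 1/u, u ~ U[1, scale]
mean_score = sum(1.0 / random.uniform(1.0, scale)
                 for _ in range(samples)) / samples
print(f"sampled mean score = {mean_score:.6f}, "
      f"ln(scale)/scale = {math.log(scale) / scale:.6f}")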
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
Hi, I have a stupid question; for sure I'm missing something.


  • Finalization Block: It is an ordinary bitcoin block with some exceptions
    • 1- Its merkle root points to a  Net Merkle Tree
    • 2- It is fixed to yield a hash that is as difficult as target difficulty * 0.02
    • 3- It has a new field which is a pointer to (the hash of) a non-empty Shared Coinbase Transaction
    • 4- The Shared Coinbase Transaction's sum of difficulty scores is greater than or equal to 0.95

I cannot see any reward for the Finalization block.
Where is the incentive to mine a Finalization block?

The block reward is distributed by means of the Shared Coinbase Transaction, in which the first entry is a special share fixed to have a score of 0.02, and it obviously refers to the wallet address of the miner (of the Finalization Block).

  • Coinbase Share: it is new too and is composed of
    • 1- A Collaborating miner's wallet address
    • 2- A nonce
    • 3- A computed difficulty score using the hash of
      • previous block's hash padded with
      • current block's merkle root, padded with
      • Collaborating miner's address padded with the nonce field
    • 4-  A reward amount field
• Shared Coinbase Transaction: It is a list of Coinbase Shares
    • The first share's difficulty score field is fixed at 2%
    • For each share, the difficulty score is at least as good as 0.0001
    • The sum of the reward amount fields is equal to the block reward, and each share's reward is calculated proportional to its difficulty score
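To make the structure above concrete, here is a sketch of these records (field names are mine; the fixed 0.02 first-share score, the 0.0001 minimum, and the 0.95 sum are from the definitions above):

Code:
from dataclasses import dataclass
from typing import List

@dataclass
class CoinbaseShare:
    miner_address: str       # collaborating miner's wallet address
    nonce: int
    difficulty_score: float  # scored from hash(prev_hash | merkle_root | address | nonce)
    reward: float            # portion of the block reward, proportional to the score

@dataclass
class SharedCoinbaseTransaction:
    shares: List[CoinbaseShare]  # shares[0] is the finalizer's fixed 2% entry

    def is_valid(self, block_reward, min_score=0.0001, threshold=0.95):
        first_fixed = abs(self.shares[0].difficulty_score - 0.02) < 1e-12
        scores_ok = all(s.difficulty_score >= min_score for s in self.shares[1:])
        sum_ok = sum(s.difficulty_score for s in self.shares) >= threshold
        reward_ok = abs(sum(s.reward for s in self.shares) - block_reward) < 1e-9
        return first_fixed and scores_ok and sum_ok and reward_ok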
legendary
Activity: 1456
Merit: 1174
Always remember the cause!
I am not much of a probability theory expert, but for now I'm persuaded by @tromp's calculations:

NOTE that you overlooked my fix where ln(n) should instead be ln(1/mindiff).
Would you please do a complete rewrite of your proposed formula ... for clarification? Substituting ln(1/mindiff) for ln(n) just makes no sense to me, or I'm missing something here.