
Topic: [ANN] profit switching auto-exchanging pool - www.middlecoin.com - page 469. (Read 829908 times)

sr. member
Activity: 406
Merit: 250
You really gotta give him props, though.  He is going to great lengths to try to persuade others of his way of thinking... even though, as many have pointed out, it's simply incorrect.
member
Activity: 94
Merit: 10
For each unit of sample size, I run one instance of the simulation. Each instance represents one block of a coin.

I generate a solve time for that block. Our constant variable is the average block solve time. I generate a random number between 1/2 of it and 1.5x of it; this is the solve time for this particular block. Remember, by the very definition of the word average, future values will be evenly balanced across both sides.

You just don't get it, do you? This is precisely what's wrong, as we've pointed out over and over again. If you want to use solve times, you have to use the solve times of a Poisson process. A simulation built on the wrong distribution just reproduces the errors of the wrong distribution.

Stop spreading your factually incorrect point over and over and address the criticism.

Edit:
This would be a better example of how it works. Maybe it'll make you understand it better. Note the complete lack of dependence on block find time, even in the source.

Code:
import random
import numpy as np

class worker():
    def __init__(self, hashrate):
        self.hashrate = hashrate
        self.sharesolvetime = 60. / hashrate
        self.solvechance = 1 / self.sharesolvetime  # solve chance per unit time
        self.shares = 0

class pool():
    def __init__(self, blockfindtime):
        self.blockfindtime = blockfindtime


duration = 10000   # the higher the better
timestep = 0.001   # should be as low as possible;
                   # worker1.solvechance*timestep must be as close to zero
                   # as possible to be factually accurate

for bft in np.logspace(0.1, 3, 5):
    pool1 = pool(bft)
    worker1 = worker(1)   # recreated every pass so share counts
    worker2 = worker(12)  # don't accumulate across block find times
    steps = int(duration / timestep) + 1
    for t in np.linspace(0, duration, steps):
        if random.uniform(0, 1) < worker1.solvechance * timestep:
            worker1.shares += 1
        if random.uniform(0, 1) < worker2.solvechance * timestep:
            worker2.shares += 1

    print("Worker 1 has: " + str(float(worker1.hashrate) / float(worker2.hashrate + worker1.hashrate) * 100) + ' percent of the hash power')
    print("But worker 1 has: " + str(float(worker1.shares) / float(worker2.shares + worker1.shares) * 100) + ' percent of the profit')
    print("When worker1's average share-find-speed was: " + str(float(pool1.blockfindtime) / float(worker1.sharesolvetime)) + "x Block find speed")
    print("Over sample size of " + str(steps) + " steps\n")
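As a side-by-side check of the distributional point (a sketch, not from the thread; the mean of 60 is an arbitrary illustrative value): solve times in proof-of-work follow an exponential distribution, and even with the same average it behaves very differently from the uniform 0.5x-1.5x model being criticised.

```python
import random

random.seed(1)
mean = 60.0    # average solve time; arbitrary illustrative value
n = 200000

# The disputed model: uniform between 0.5x and 1.5x of the average.
uniform_samples = [random.uniform(0.5 * mean, 1.5 * mean) for _ in range(n)]

# A Poisson process: exponentially distributed solve times.
exp_samples = [random.expovariate(1.0 / mean) for _ in range(n)]

# Both models have the same average...
print(sum(uniform_samples) / n)   # close to 60
print(sum(exp_samples) / n)       # close to 60

# ...but only the exponential ever produces very short or very long
# solves, which is exactly where the two simulations diverge.
print(max(uniform_samples) <= 1.5 * mean)        # True: hard cap at 90
print(any(s > 1.5 * mean for s in exp_samples))  # True: long tail exists
```

Roughly e^-1.5 ≈ 22% of the exponential samples exceed the uniform model's hard cap, so the two models disagree on every unusually fast or slow solve.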
newbie
Activity: 34
Merit: 0
Asking again because the first time got buried in all the difficulty... difficulties...

What solution is there for a backup/failover pool for a multi-coin pool such as this? Cgminer explicitly states that mixing blockchains is a no-no, so setting a LTC pool (or similar) as backup won't work.

Anyone?

Although the CGMiner readme does say not to work on multiple blockchains, I have had no issues using an LTC pool as a backup for this pool. Actually for the past week or so I have been using the "balance" option to split my hashrate between this pool and my preferred LTC pool. Reject rate for the LTC pool is ~0.5% and of course reject rate for middlecoin pool varies, but CGWatcher reports a total reject rate of 1.64% for the past 36 hours. YMMV, but it seems to be working just fine this way for me.
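For reference, that setup looks roughly like this on the command line (a sketch only: pool URLs, ports, and credentials are placeholders, not the real middlecoin details; flags as documented in the cgminer README of that era):

```shell
# Sketch: failover vs. balance between two scrypt pools in cgminer.
# Pool order matters: with the default failover strategy, the first -o
# entry is primary and later entries are used only when it goes down.
cgminer --scrypt \
  -o stratum+tcp://middlecoin.example:3333 -u 1YourBtcAddress -p x \
  -o stratum+tcp://ltc.pool.example:3333   -u worker1         -p x \
  --balance
# --balance switches the multipool strategy from failover to splitting
# work evenly across all alive pools; drop it to keep plain failover.
```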
hero member
Activity: 505
Merit: 500
Cryptsy added a new cool feature for auto selling deposited coins : https://bitcointalk.org/index.php?topic=246679.680

So now you can add a switching-coin profile based on profitability in CGWatcher -> manage your desired coins for mining -> start mining and switching coins based on your chosen profitability -> forward the coin's wallet at the pool you're mining at to Cryptsy at your comfortable threshold -> select the auto-sell feature on the Cryptsy deposit address -> coins get auto-deposited and auto-sold -> collect BTC.
sr. member
Activity: 392
Merit: 250
sr. member
Activity: 392
Merit: 250
For some reason I don't think I'm making much with a 2-2.4 Mhash miner: 0.03 BTC in around 30 hours. Is anyone else finding this odd?
full member
Activity: 141
Merit: 100
Asking again because the first time got buried in all the difficulty... difficulties...

What solution is there for a backup/failover pool for a multi-coin pool such as this? Cgminer explicitly states that mixing blockchains is a no-no, so setting a LTC pool (or similar) as backup won't work.

Anyone?
full member
Activity: 212
Merit: 100
I don't know why, but I could never make this pool work for more than a few minutes without my rig getting shut down.

I gave up for that reason, and also because the host makes a huge profit on you. Last but not least, a bot will never trade as well as you would yourself.

The bot just dumps the coins. Just by selling your coins a bit higher, or at the right moment, you make about a 30% difference. Nothing hard.

I heard about 270% profit compared to bitcoin. That is small. Without switching altcoins, and without spending more than 10 minutes a day on trading, I get to at least 300%.




Funny, there was a pump of a coin so h2o sold, so we made out well on that. If you like doing all that, multipool.us might be a better option for you. I like the set it and forget it. I already spend way too much time I don't have doing this crypto stuff.

H2O, what's up today?  Looks like it could be a record payout?  I am already almost up to what my normal daily payout is and it's not even 11am.  Shocked

 I guess all those coins matured?



Yeah, that was part of it, too.
full member
Activity: 196
Merit: 100
H2O, what's up today?  Looks like it could be a record payout?  I am already almost up to what my normal daily payout is and it's not even 11am.  Shocked

 I guess all those coins matured?

legendary
Activity: 2156
Merit: 1131
I don't know why, but I could never make this pool work for more than a few minutes without my rig getting shut down.

I gave up for that reason, and also because the host makes a huge profit on you. Last but not least, a bot will never trade as well as you would yourself.

The bot just dumps the coins. Just by selling your coins a bit higher, or at the right moment, you make about a 30% difference. Nothing hard.

I heard about 270% profit compared to bitcoin. That is small. Without switching altcoins, and without spending more than 10 minutes a day on trading, I get to at least 300%.


legendary
Activity: 1537
Merit: 1005
Topic over I hope.

H2 can we get btc/day statistics posted again please? Wink
sr. member
Activity: 406
Merit: 250
So am I affected by the High Dif and if so, how?

Just trying to understand how it affects me…

It doesn't affect anyone, but if it did, then it would affect everyone. Smiley
newbie
Activity: 28
Merit: 0
The following is my final word on this subject. I will shut up forever about it, after this. You are free to pick apart my script, find all the logical flaws with it, modify it, publicly shame/praise it. I don't care.

A lot of people aren't satisfied with theoretical equations. So I created a simulation. I wrote a script in Python; it does the following things.

Takes 4 input variables:

  • Average block solve time (a result of the pool's hashrate and the network difficulty of the current coin)
  • Worker 1 speed (your slower worker; represents the worker's hashrate in relation to the rest of the pool)
  • Worker 2 speed (your faster worker; same)
  • The sample size. A high value reduces the random variance

For each unit of sample size, I run one instance of the simulation. Each instance represents one block of a coin.

I generate a solve time for that block. Our constant variable is the average block solve time. I generate a random number between 1/2 of it and 1.5x of it; this is the solve time for this particular block. Remember, by the very definition of the word average, future values will be evenly balanced across both sides.

I then run a separate simulation, using the same solve clock, for both workers.

For each worker, I generate a random value which is 1/2 to 1.5x of their share solve time. Remember, these values don't have to be real, since all we care about is the relation to the other worker.

I check and see if the value is less than the clock. If it is, I credit the worker with 1 share, and subtract the share solve time from the clock time. I do this until the solve time finally becomes greater than the remaining clock.

Thus, I have simulated the number of shares that worker got from the block.

I do the same for the other worker, who has a faster share-solve-time.

The rest is just calculating and displaying statistics.

Here's the code:

import random

class worker():
    def __init__(self, hashrate):
        self.hashrate = hashrate
        self.sharesolvetime = 60 // hashrate
        self.shares = 0

class pool():
    def __init__(self, blockfindtime):
        self.blockfindtime = blockfindtime

pool1 = pool(500)
worker1 = worker(1)
worker2 = worker(12)
samplesize = 100000

for n in range(0, samplesize):
    clock = random.randint(pool1.blockfindtime // 2, pool1.blockfindtime + pool1.blockfindtime // 2)
    clock1 = clock
    while clock1 > 0:
        sharesolve = random.randint(worker1.sharesolvetime // 2, worker1.sharesolvetime + worker1.sharesolvetime // 2)
        if sharesolve > clock1:
            break
        else:
            worker1.shares = worker1.shares + 1
            clock1 = clock1 - sharesolve
    clock2 = clock
    while clock2 > 0:
        sharesolve = random.randint(worker2.sharesolvetime // 2, worker2.sharesolvetime + worker2.sharesolvetime // 2)
        if sharesolve > clock2:
            break
        else:
            worker2.shares = worker2.shares + 1
            clock2 = clock2 - sharesolve

print("Worker 1 has: " + str(float(worker1.hashrate) / float(worker2.hashrate + worker1.hashrate) * 100) + ' percent of the hash power')
# percent of the combined share total, not of worker2's shares alone
print("But worker 1 has: " + str(float(worker1.shares) / float(worker2.shares + worker1.shares) * 100) + ' percent of the profit')
print("Over sample size of " + str(samplesize))
print("When worker1's average share-find-speed was: " + str(float(pool1.blockfindtime) / float(worker1.sharesolvetime)) + "X the block-find-speed")
It displays the following stats:

  • What percent of the hash power worker1 has
  • What percentage of the profit (shares) he ended up with
  • What sample size we used
  • What was the ratio of worker1's time to find a share to the pool's time to find a block (another way of saying, how "fast" was the coin)

I will now give you the results of running this script. I will use the same worker speeds, but I will change the block solve time. I will give 5 examples.

One important point: the block solve time represents the speed of the coin; we can't do anything about that.
The share solve time is what we want to affect. There are two ways to change it: change your hashrate, or change the share difficulty.

Now then... the results...


Very Slow coin (something like LTC):

Worker 1 has: 7.69230769231 percent of the hash power
But worker 1 has: 7.12127534135 percent of the profit
Over sample size of 100000
When worker1's average share-find-speed was: 8.33333333333X the block-find-speed

Pretty Slow coin

Worker 1 has: 7.69230769231 percent of the hash power
But worker 1 has: 6.6950187416 percent of the profit
Over sample size of 100000
When worker1's average share-find-speed was: 4.0X the block-find-speed

Medium/Fast Coin

Worker 1 has: 7.69230769231 percent of the hash power
But worker 1 has: 5.89708931026 percent of the profit
Over sample size of 100000
When worker1's average share-find-speed was: 2.0X the block-find-speed

Fast Coin

Worker 1 has: 7.69230769231 percent of the hash power
But worker 1 has: 4.07045734716 percent of the profit
Over sample size of 100000
When worker1's average share-find-speed was: 1.0X the block-find-speed

Extremely Fast coin

Worker 1 has: 7.69230769231 percent of the hash power
But worker 1 has: 1.15306809456 percent of the profit
Over sample size of 100000
When worker1's average share-find-speed was: 0.5X the block-find-speed

A quick analysis of the results supports the following conclusion:

There is a skew toward faster miners, in terms of their percentage of profit compared to their percentage of hashrate, on any pool, for any coin. The effect increases sharply as the worker's share-find time approaches the pool's block-find time.

This effect is negligible and irrelevant for slow coins with slow block times. However, as the share-find time of slower workers approaches the block-find time of fast coins, those miners begin to lose an extreme amount of profit.

This only takes into account block changes where the client hears about the new block in time. There are also rejected shares, where it hears about the new block too late and happens to solve a share first (this is what is measurable on the site's stats page), and also coin changes.
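For contrast (a sketch, not part of the original post; the worker speeds and block times are the same illustrative numbers used above): rerunning the same share-counting idea with exponentially distributed solve times, i.e. an actual Poisson process, shows no skew at any block time, because the process is memoryless.

```python
import random

random.seed(42)

def simulate(blockfindtime, hashrates, samplesize=20000):
    """Count shares per worker when all solve times are exponential."""
    shares = [0] * len(hashrates)
    for _ in range(samplesize):
        # Block solve time for this round (also exponential).
        clock = random.expovariate(1.0 / blockfindtime)
        for i, h in enumerate(hashrates):
            meansolve = 60.0 / h
            t = random.expovariate(1.0 / meansolve)
            while t <= clock:          # count share arrivals before the
                shares[i] += 1         # block ends; leftover time is lost
                t += random.expovariate(1.0 / meansolve)
    return shares

for bft in (500.0, 60.0, 30.0):  # slow, fast, extremely fast coin
    s1, s2 = simulate(bft, [1, 12])
    print("blockfindtime %5.0f: worker1 got %.2f%% of the shares"
          % (bft, 100.0 * s1 / (s1 + s2)))
# Worker 1 has 1/13 = ~7.69% of the hash power and gets ~7.69% of the
# shares at every block time: no skew toward the faster worker.
```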
hero member
Activity: 896
Merit: 1000
Ok..  So as a total Noob who has no clue how Diff really works, could someone (who believes the Dif should be changed) explain who is affected and how they are affected?  I hear stuff like 25% hashing power lost, or no shares found, etc.

When you say people with low Hashing Rigs, please define what low is..   Like 100?  400?  1300?  2600?

The reason I ask is that I feel I did pretty good in my one month here so far (better than if I had just mined Litecoin)

My Rig has 2   7970 cards and my WU in CGMiner stays right between 1300 and 1400.    In CGminer both my cards show low 700’s

MiddleCoin reports me between 1200 to 1500 with 1350 being the Avg

In one Month here I earned right at 1.6 BTC  Or  $195  (Litecoin I calculated I would have made around $130 for one month)

So am I affected by the High Dif and if so, how?

Just trying to understand how it affects me…



You are not affected by the high difficulty, as the hash power of each individual card is 7xx kh/s. That is higher than most GPU cards.

In high-difficulty mining, individual cards matter, because a share is mined by a single GPU, not shared by all the GPUs in one rig, according to what I learned from this thread.

I only use 7950/70 cards in this pool. All my 5850 and 6990 cards are mining LTC in another pool.
full member
Activity: 196
Merit: 100
Ok..  So as a total Noob who has no clue how Diff really works, could someone (who believes the Dif should be changed) explain who is affected and how they are affected?  I hear stuff like 25% hashing power lost, or no shares found, etc.

When you say people with low Hashing Rigs, please define what low is..   Like 100?  400?  1300?  2600?

The reason I ask is that I feel I did pretty good in my one month here so far (better than if I had just mined Litecoin)

My Rig has 2   7970 cards and my WU in CGMiner stays right between 1300 and 1400.    In CGminer both my cards show low 700’s

MiddleCoin reports me between 1200 to 1500 with 1350 being the Avg

In one Month here I earned right at 1.6 BTC  Or  $195  (Litecoin I calculated I would have made around $130 for one month)

So am I affected by the High Dif and if so, how?

Just trying to understand how it affects me…

member
Activity: 94
Merit: 10
You guys are all talking about high difficulty and its relation to block finding and income.

I'm looking at the opposite side: think about high difficulty and its relation to rejected shares. You lose more when shares are rejected at high difficulty.

Then read the god damn thread.
sr. member
Activity: 490
Merit: 250
You guys are all talking about high difficulty and its relation to block finding and income.

I'm looking at the opposite side: think about high difficulty and its relation to rejected shares. You lose more when shares are rejected at high difficulty.
member
Activity: 94
Merit: 10
AFAIK, there is no SET ANSWER for a block.  You solve shares for the WHOLE NETWORK, and if you solve the block, it gets attributed to the pool.  Pools aren't given a block to play with and hash until a given share is right.  I could be wrong, but I believe this is the proper explanation.

That's completely irrelevant, this is an analogy about chances, not about uniqueness. Chance is chance. The probability for me to draw a red ball out of an urn with 10 red balls and 990 white balls is exactly the same as the chance to draw a red ball out of an urn with 1 red ball and 99 white balls.

Additionally, what Wolf0 said is also true.
hero member
Activity: 585
Merit: 500
Soo... anyway. Someone might be interested in this. I took h2o's json and made a page you can click on for your individual stats.

http://middlepage.gurutech.ws/stats.php
You can just put in your address like this
http://middlepage.gurutech.ws/stats.php?id=1P4yWhXx7FKtWiGJYZuX5CNEm126iSFuZg
or just click the link and bookmark that. I may have to take it down if the boss doesn't want it there.

might be worth getting this feature added on the main middlecoin site
member
Activity: 60
Merit: 10
Because I can't resist the prospect of getting flamed, here is my analogy.

A guy writes down a number from 1-100 on the back of a postcard.  You guess what that number is, and if you choose the correct number, you win.  You keep guessing as fast as you can.

Every 30 seconds, the guy tears up the postcard and creates another one with a new number.

Can we agree that your previous guesses have no bearing on your chances of choosing the correct number now?  Same when a block changes.

Expanding the analogy to include difficulty makes no difference. You now need to choose a number from 1-500, so it's going to take you longer to find a correct answer. But when the guy tears up the postcard and writes down a new number, your previous guesses do not affect your new guesses.

BTW: I am all for lowering difficulty to reduce *variance*, and I'm sure there is some flaw in that analogy, so flame away Smiley
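The memorylessness in the postcard analogy can be checked numerically (a sketch, assuming each guess is an independent 1-in-100 draw, like a hash attempt):

```python
import random

random.seed(7)
p = 1.0 / 100.0   # chance per guess of hitting the hidden 1-100 number
trials = 400000

# Hit rate of a single fresh guess.
fresh_hits = sum(random.random() < p for _ in range(trials))

# Hit rate of one more guess, given the previous 10 guesses all failed.
after_failures = 0
conditioned = 0
for _ in range(trials):
    if all(random.random() >= p for _ in range(10)):  # 10 misses first
        conditioned += 1
        if random.random() < p:
            after_failures += 1

print("P(hit) on a fresh guess:  %.4f" % (fresh_hits / trials))
print("P(hit) after 10 misses:   %.4f" % (after_failures / conditioned))
# Both come out near 0.01: past guesses carry no information about the
# next one, just as past hashes say nothing about the next share.
```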

AFAIK, there is no SET ANSWER for a block.  You solve shares for the WHOLE NETWORK, and if you solve the block, it gets attributed to the pool.  Pools aren't given a block to play with and hash until a given share is right.  I could be wrong, but I believe this is the proper explanation.

If the above explanation is correct, then the correct analogy would be someone taking 50 pieces of paper, with a number between 1-512 on each of the 50.  Those 50 pieces of paper are not assigned to anyone.  Pools share all their workers' shares, and if someone from a pool hits a number from the 50 pieces, they get paid.  Pieces of paper aren't given to pools until someone guesses the number on one.

I could be horribly wrong though, but a pool operator told me this some time ago.