
Topic: Break even difficulty by hardware efficiency (power cost = value of BTC) - page 2.

member
Activity: 84
Merit: 10
1 HP ≈ 746 watts

1 HP ≈ burning 641 kcal (food calories) an hour

300 kcal (a 75 kg person cycling at <10 mph for an hour) = (746/641) × 300 = 349.14 watts = 349.14 joules/second ≈ 384 GH/s ≈ 2 bitcents per hour at a 3 PH/s network rate ≈ $4 @ $200 per BTC

So, with an appropriate exercise machine attached to a miner, your 1 hour workout would also give you some Bitcoin rewards. Cool

Then you would have to order pizza for BTC to replenish calories lost on mining. Just like an ordinary coal miner :-)
sr. member
Activity: 405
Merit: 255
@_vjy
1 HP ≈ 746 watts

1 HP ≈ burning 641 kcal (food calories) an hour

300 kcal (a 75 kg person cycling at <10 mph for an hour) = (746/641) × 300 = 349.14 watts = 349.14 joules/second ≈ 384 GH/s ≈ 2 bitcents per hour at a 3 PH/s network rate ≈ $4 @ $200 per BTC

So, with an appropriate exercise machine attached to a miner, your 1 hour workout would also give you some Bitcoin rewards. Cool
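For anyone who wants to rerun the arithmetic, here is a minimal Python sketch of the same chain of conversions. The ~0.9 J/GH miner efficiency, 25 BTC block reward and ~6 blocks per hour are assumptions filled in to roughly reproduce the quoted figures; they are not stated in the post above.

```python
# Workout-to-bitcoin back-of-the-envelope, under the assumptions above.
KCAL_PER_HP_HOUR = 641      # 1 HP sustained for an hour ~ 641 kcal
WATTS_PER_HP = 746

workout_kcal = 300          # 75 kg person cycling at <10 mph for an hour
watts = workout_kcal / KCAL_PER_HP_HOUR * WATTS_PER_HP
print(f"Mechanical power: {watts:.1f} W")                  # ~349 W

j_per_gh = 0.9              # assumed miner efficiency
hashrate_ghs = watts / j_per_gh
print(f"Hashrate powered: {hashrate_ghs:.0f} GH/s")        # ~388 GH/s

network_ghs = 3_000_000     # 3 PH/s network rate from the post
btc_per_hour = 25 * 6       # 25 BTC per block, ~6 blocks per hour (assumed)
reward = hashrate_ghs / network_ghs * btc_per_hour
print(f"Reward: {reward:.4f} BTC/h (~${reward * 200:.2f} at $200/BTC)")
```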
sr. member
Activity: 405
Merit: 255
@_vjy
According to thegenesisblock.com none of the current machines will pay for themselves.... What do you say about that?

Maybe the fiat that goes in can come back out as fiat: $5k miner + $1k electricity in >> $6k out. But I'm sure no one is getting their BTC back out; 100 BTC in >> 50 BTC out. You should consider yourself lucky if the exchange rate doubled in the meantime.

Now hoarding becomes a better investment than investing in miners.

hero member
Activity: 696
Merit: 500
According to thegenesisblock.com none of the current machines will pay for themselves.... What do you say about that?
sr. member
Activity: 405
Merit: 255
@_vjy
By taking $$$ out of the equation, your calculations come down to 1 mBTC/kWh ($0.10 per kWh at $100 per BTC), or 500 µBTC/kWh ($0.10 per kWh at $200).

If some solar / wind (or horse-powered) power companies jump into Bitcoin mining and offer electricity priced in BTC, then Bitcoin mining would look entirely different.

These companies could list their kW shares on an exchange for trading, like cex.io GHS. Cool

I am so looking forward to seeing this happen. My prediction / guesstimate: this is going to happen in 2014-Q4. Smiley

Maybe 22 nm / 20 nm ASICs could slow this down for another quarter or two.
donator
Activity: 1218
Merit: 1079
Gerald Davis
At 0.9 joules / GHash, a 10 PHash server farm requires 2.5 kWh?!

I must be wrong somewhere. Huh

Is it 0.9 joules / second / GHash?

In that case, it would be 9000 kWh.

kWh is a measure of energy (power over time).
kW is a measure of power.

1 kW for 1 hour = 1 kWh


10 PH/s farm
0.9 J/GH * 10,000,000 GH/s = 9,000,000 J/s = 9,000,000 W = 9,000 kW

So yes, 9,000, but it is kW not kWh.  Now if you ran your 9,000 kW farm for 1 hour it would use 9,000 kWh of energy.
In a year that 9,000 kW farm would use 9,000 * 24 * 365 ≈ 78.8 million kWh.  At $0.10 per kWh that would be about $7.9 million in energy cost.

If the joule conversion confuses you: 0.9 J/GH = 0.9 W per GH/s, but that looks ugly. Smiley
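As a minimal sketch of the same power-versus-energy arithmetic (same assumed 0.9 J/GH, 10 PH/s and $0.10/kWh figures), in Python:

```python
j_per_gh = 0.9
farm_ghs = 10_000_000            # 10 PH/s expressed in GH/s
usd_per_kwh = 0.10

power_kw = j_per_gh * farm_ghs / 1000        # J/s = W; this is power, not energy
print(f"Continuous draw: {power_kw:,.0f} kW")            # 9,000 kW

energy_kwh = power_kw * 24 * 365              # energy = power * time
print(f"Annual energy: {energy_kwh:,.0f} kWh "           # ~78.8 million kWh
      f"(~${energy_kwh * usd_per_kwh:,.0f} at $0.10/kWh)")
```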
sr. member
Activity: 405
Merit: 255
@_vjy
At 0.9 joules / GHash, a 10 PHash server farm requires 2.5 kWh?!

I must be wrong somewhere. Huh

Is it 0.9 joules / second / GHash?

In that case, it would be 9000 kWh.
newbie
Activity: 15
Merit: 0
That would put the reported DC output of the module (430 W) higher than the computed DC input (385 W).  Something isn't correct.  So either your wall wattage numbers or the output reported by the VRM is incorrect; they can't both be right.  Under ideal conditions (no cooling or host power consumption), 90% DC efficiency, 93% ATX PSU efficiency, a 430 W output would mean 430/(0.9*0.93) = 513 W, i.e. >500 W at the wall.

Next time I go to the colo I will measure the wall wattage again.

Quote
Power Supply model.

I don't remember if I've already said it, but mine is a Cooler Master V850.


Has anyone run firmware 0.97 with the bertmod files? If so, I'll run this and give you guys watts at the wall vs watts output at the VRM.

http://forum.kncminer.com/forum/main-category/main-forum/6183-bertmod-0-2-unofficial-firmware-mod-feedback-thread?p=9442#post9442

It works.
legendary
Activity: 1442
Merit: 1001
That would put the reported DC output of the module (430 W) higher than the computed DC input (385 W).  Something isn't correct.  So either your wall wattage numbers or the output reported by the VRM is incorrect; they can't both be right.  Under ideal conditions (no cooling or host power consumption), 90% DC efficiency, 93% ATX PSU efficiency, a 430 W output would mean 430/(0.9*0.93) = 513 W, i.e. >500 W at the wall.

Next time I go to the colo I will measure the wall wattage again.

Quote
Power Supply model.

I don't remember if I've already said it, but mine is a Cooler Master V850.


Has anyone run firmware 0.97 with the bertmod files? If so, I'll run this and give you guys watts at the wall vs watts output at the VRM.
legendary
Activity: 1260
Merit: 1008
That would put the reported DC output of the module (430 W) higher than the computed DC input (385 W).  Something isn't correct.  So either your wall wattage numbers or the output reported by the VRM is incorrect; they can't both be right.  Under ideal conditions (no cooling or host power consumption), 90% DC efficiency, 93% ATX PSU efficiency, a 430 W output would mean 430/(0.9*0.93) = 513 W, i.e. >500 W at the wall.

Next time I go to the colo I will measure the wall wattage again.

Quote
Power Supply model.

I don't remember if I've already said it, but mine is a Cooler Master V850.

hero member
Activity: 518
Merit: 500
Manateeeeeeees
Can we update to $200/BTC now?
legendary
Activity: 1260
Merit: 1008
Thanks for the datapoints.  It is strange the reported DC/DC output doesn't change.

~430W out regardless, while the input wattage changes significantly.

Need to make some assumptions, but let's say the 6 fans use 6W each (someone can look at the fan sticker) and the host uses another 5W.  So balance of system is ~40W @ 12VDC.  Let's also assume your PSU is 90% efficient at 220V.

v0.90 = 1650W @ 220VAC ~= 1485W DC @ 12VDC  (1485 - 40)/2 =  722W.
v0.95 =  946W @ 220VAC ~= 850W DC @ 12VDC  (850-40)/2 = 405W.

v0.90 VRM In: 722W Out: 433W Efficiency: 60% OUCH
v0.95 VRM In: 405W Out: 438W Efficiency: IMPOSSIBLE.  

So either your numbers are incorrect or the output reported by the VRM is incorrect.   It is possible the PSU efficiency in the second case was slightly higher (say 92%) and the host power usage is less, but those numbers don't change things significantly.  Looking at it the other way, the VRM is at most 90% efficient.  438W out = 486W in (at 90% efficiency), or 962W @ 12VDC for both.  Even with no fans or host wattage, the AC wattage doesn't match the reported VRM output wattage even under ideal conditions (90% efficiency @ VRM, 92% efficiency @ PSU).



Sorry, I don't think I explained the situation properly (I'm not a native English speaker, and my English is a little rusty).

I own 2 Jupiters, and all the ampere measurements reported take both of them into account.

With 0.90, together the 2 Jupiters draw 7.5 A @ 220 V. At the wall I get more or less 825 watts each.

Unfortunately I didn't get a chance to use bertmod while running 0.90 (it may not even have existed yet).

With 0.95, together the 2 Jupiters draw 4.4 A @ 220 V, circa 473 watts each.

The output I get from bertmod reported in the previous post relates to the Jupiters running fw 0.95 (Jupiter 1 = 438, Jupiter 2 = 433).

edit: fixed a few typos
donator
Activity: 1218
Merit: 1079
Gerald Davis
Thanks for the datapoints.  It is strange the reported DC/DC output doesn't change: ~430W out regardless, while the input wattage changes significantly.
On edit: fixed misunderstanding of numbers reported. Both output DC numbers are v0.95.  There are no DC output numbers for v0.90.

Need to make some assumptions, but let's say the 6 fans use 6W each (someone can look at the fan sticker and let me know wattage or amps) and the host uses another 5W.  That puts the balance of the system at ~40W.  Let's also assume your PSU is 90% efficient at 220V, which is pretty reasonable for an 80 Plus Gold unit over most of its operating range (208V-240V tends to be 1% to 2% more efficient than 120V).

v0.95 =  473W @ 220VAC ~= 425W DC @ 12VDC  (425-40) = 385W Input for VRM (96W per module).

That would put the reported DC output of the module (430 W) higher than the computed DC input (385 W).  Something isn't correct.  So either your wall wattage numbers or the output reported by the VRM is incorrect; they can't both be right.  Under ideal conditions (no cooling or host power consumption), 90% DC efficiency, 93% ATX PSU efficiency, a 430 W output would mean 430/(0.9*0.93) = 513 W, i.e. >500 W at the wall.

Still, let's look at the wall efficiency.
v0.90 825W / 495 GH/s = 1.7 J/GH  OUCH. Smiley
v0.95 473W / 495 GH/s = 1.0 J/GH

If anyone else has KNC datapoints please provide the following:
KNC model.
Average hashrate.
Wattage at the wall.
Firmware version.
# of VRMs present (4 or 8).
Power Supply model.
Mains voltage (120V, 208V, 220V, 240V, etc.).

Thanks.
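For reference, the consistency check above can be reproduced with a few lines of Python. The efficiency figures and the ~40W of fan/host overhead are the assumptions stated in the post, not measured values:

```python
PSU_EFF = 0.90          # assumed 80 Plus Gold efficiency at 220V
PSU_EFF_IDEAL = 0.93    # ideal ATX PSU efficiency used for the lower bound
VRM_EFF = 0.90          # assumed best-case DC/DC efficiency
OVERHEAD_W = 40         # fans + host, assumed

# DC power actually available to the VRMs, given the measured wall wattage.
measured_wall_w = 473
vrm_input_w = measured_wall_w * PSU_EFF - OVERHEAD_W
print(f"{measured_wall_w} W at the wall -> ~{vrm_input_w:.0f} W of VRM input")   # ~386 W

# Minimum wall draw that could produce the reported VRM output (no overhead).
reported_vrm_out_w = 430
min_wall_w = reported_vrm_out_w / (VRM_EFF * PSU_EFF_IDEAL)
print(f"{reported_vrm_out_w} W reported out -> at least {min_wall_w:.0f} W at the wall")  # ~514 W

# The reported output (430 W) exceeds the available input (~386 W), which is
# why one of the two readings has to be wrong.
```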

legendary
Activity: 1260
Merit: 1008
Updated (improved) KNC to 1.0 J/GH based on a reported 455W @ 12 VDC running at 502 GH/s.  Assuming 90% AC power supply efficiency gives an estimated 505W at the wall.  Also moved KNC into the "actual devices" category.  Bitfury still holds the efficiency crown though.

https://bitcointalksearch.org/topic/m.3307091

On edit: revised slightly to 1.1 J/GH.  The 455W was the output (0.75V) of the DC-to-DC converter.  Based on the GE spec sheet the input current would be ~10% higher.  That puts it closer to 560W at the wall.  This appears to be in line with other customer reports using kill-a-watt type meters.  On some firmware and with the 4-VRM model the wattage is significantly higher, ~1.3 J/GH.

Just to add a few datapoints.

I own 2 Jupiters, both with 8 VRMs per PCB, and each with a faulty ASIC module that has a die with all 48 cores disabled.

At the pool I get something like 490-495 GH/s each.

With fw 0.90 the combined power consumption was 7.5 A @ 220 V at the wall --> 1650 W = 825 W each.

With fw 0.95 the hashrate stays the same and I get 4.4 A @ 220 V at the wall --> 946 W = 473 W each.

bertmod reports that the ASIC slots' total DC/DC power output is 433 W and 438 W respectively.

I use the same PSU model for both: Cooler Master V850.
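A quick Python check of the amps-to-watts conversion in these datapoints, assuming the stated 220 V mains and the two Jupiters sharing the reading; the small gap against the quoted 946 W / 473 W for fw 0.95 is just rounding of the current reading:

```python
MAINS_V = 220
UNITS = 2
HASHRATE_GHS = 495          # roughly what each Jupiter reports at the pool

for fw, amps in [("0.90", 7.5), ("0.95", 4.4)]:
    total_w = amps * MAINS_V
    per_unit_w = total_w / UNITS
    print(f"fw {fw}: {amps} A * {MAINS_V} V = {total_w:.0f} W total, "
          f"~{per_unit_w:.0f} W per Jupiter, {per_unit_w / HASHRATE_GHS:.2f} J/GH")
# fw 0.90: 1650 W total, ~825 W each, ~1.67 J/GH
# fw 0.95: ~968 W total, ~484 W each, ~0.98 J/GH
```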

full member
Activity: 249
Merit: 100
I think a likely outcome of corporate hashrate monopolisation would be the death of crypto. It's just not a viable way to go, because it would remove fundamental human interest in the entire concept in these early years.  The technologists of the world, people like you and me, would jump ship in droves, and the concept would die.  Because we make it, in the end.

I don't doubt that big capital is putting more in as time goes on, but if they go too far and take over what isn't settled yet, they'll end up with nothing.

The thing about Bitcoin is that, so far (and probably for quite a long time yet to come), it is an alternative; we can all just dump it at any time and resume our normal fiat lives, should we become sufficiently disillusioned with it for any reason.
hero member
Activity: 924
Merit: 1000
donator
Activity: 1218
Merit: 1079
Gerald Davis
Updated (improved) KNC to 1.0 J/GH based on a reported 455W @ 12 VDC running at 502 GH/s.  Assuming 90% AC power supply efficiency gives an estimated 505W at the wall.  Also moved KNC into the "actual devices" category.  Bitfury still holds the efficiency crown though.

https://bitcointalksearch.org/topic/m.3307091

On edit: revised slightly to 1.1 J/GH.  The 455W was the output (0.75V) of the DC-to-DC converter.  Based on the GE spec sheet the input current would be ~10% higher.  That puts it closer to 560W at the wall.  This appears to be in line with other customer reports using kill-a-watt type meters.  On some firmware and with the 4-VRM model the wattage is significantly higher, ~1.3 J/GH.
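The revised estimate can be reconstructed as a short sketch; the 10% DC/DC input overhead and 90% PSU efficiency are the assumptions stated in the post above:

```python
dc_output_w = 455            # reported output of the DC-to-DC converters (0.75V side)
hashrate_ghs = 502

dcdc_input_w = dc_output_w * 1.10      # ~10% higher input per the GE spec sheet
wall_w = dcdc_input_w / 0.90           # assumed 90% AC power supply efficiency
print(f"Estimated wall draw: {wall_w:.0f} W")             # ~556 W, i.e. close to 560 W
print(f"Efficiency: {wall_w / hashrate_ghs:.2f} J/GH")    # ~1.1 J/GH
```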
donator
Activity: 1218
Merit: 1079
Gerald Davis
legendary
Activity: 2126
Merit: 1001
Quote
Why did you make this?

I find it silly that people have projections with difficulty going to 1 trillion or more (I think the highest I have seen is 200 trillion). At a mere 300 billion difficulty all current and proposed mining devices would be operated at a significant loss (assuming a $100 exchange rate & a $0.10 electrical rate).  So at 1 trillion in difficulty even a CoinTerra rig would be converting $4 in electricity into $1 in Bitcoins.  Anyone think that is likely?  Significantly higher difficulty is going to require either more efficient hardware, a higher exchange rate, or the average cost of power for the network to decline.

Interesting numbers. We can take it that the home / basement / garage miner's time has come to an end, or will end soon. Considering a data center environment, where hosting / rent is charged per rack unit, I wonder how these numbers would work out.

Total network hashrate is still unpredictable. Is it possible to say whether a given investment (miner + hosting cost) will break even in 6 months?


Not necessarily.
Home-based miners don't count costs for manpower/administration, often don't use air conditioning (which doubles the electricity price), and may more readily mine on a very thin or negative margin. And then there's an army of people with no (self-paid) electricity costs at all.
But yes, those groups will probably never again reach the share of total mining power they had in CPU or GPU times.

Ente
sr. member
Activity: 405
Merit: 255
@_vjy
Quote
Why did you make this?

I find it silly that people have projections with difficulty going to 1 trillion or more (I think the highest I have seen is 200 trillion). At a mere 300 billion difficulty all current and proposed mining devices would be operated at a significant loss (assuming a $100 exchange rate & a $0.10 electrical rate).  So at 1 trillion in difficulty even a CoinTerra rig would be converting $4 in electricity into $1 in Bitcoins.  Anyone think that is likely?  Significantly higher difficulty is going to require either more efficient hardware, a higher exchange rate, or the average cost of power for the network to decline.

Interesting numbers. We can take it that the home / basement / garage miner's time has come to an end, or will end soon. Considering a data center environment, where hosting / rent is charged per rack unit, I wonder how these numbers would work out.

Total network hashrate is still unpredictable. Is it possible to say whether a given investment (miner + hosting cost) will break even in 6 months?
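To tie this back to the thread title, here is a minimal Python sketch of the break-even difficulty at which electricity cost equals the value of the BTC mined. The 0.9 J/GH, $0.10/kWh, $100/BTC and 25 BTC block reward figures below are illustrative assumptions, not numbers taken from the posts above:

```python
HASHES_PER_DIFFICULTY = 2**32    # expected hashes per block at difficulty 1
JOULES_PER_KWH = 3.6e6

def breakeven_difficulty(j_per_gh, usd_per_kwh, usd_per_btc, block_reward_btc=25):
    """Difficulty at which power cost per block equals the block reward value."""
    revenue_usd = block_reward_btc * usd_per_btc
    # electricity cost per block, per unit of difficulty
    cost_per_difficulty = (HASHES_PER_DIFFICULTY * j_per_gh * 1e-9
                           / JOULES_PER_KWH * usd_per_kwh)
    return revenue_usd / cost_per_difficulty

print(f"{breakeven_difficulty(0.9, 0.10, 100):.3g}")   # ~2.3e10 at $100/BTC
print(f"{breakeven_difficulty(0.9, 0.10, 200):.3g}")   # doubles at $200/BTC
```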