
Topic: 1GH/s, 20w, $700 (was $500) — Butterflylabs, is it for real? (Part 2)

rph
full member
Activity: 176
Merit: 100
I'm willing to pay up to $0.75/MH for FPGA-based miners, because going from 400W/GH to 60W/GH is a major savings in power cost, and it's much easier to deal with the heat.

$0.75/MH is pretty much impossible in a fully assembled/tested product using current FPGAs. Maybe next year with 28nm.


-rph
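
For rough scale on why sub-$0.75/MH is hard with current parts (a back-of-envelope sketch; the per-chip hashrate and price below are assumptions for illustration, not figures from this thread):

```python
# Back-of-envelope $/MH for the FPGA silicon alone.
# Both inputs are assumptions: ~200 MH/s per chip (a commonly
# cited Spartan-6 LX150 mining figure) and a bare-chip price
# on the order of $150.
chip_cost_usd = 150.0
chip_hashrate_mh = 200.0

print(f"chip alone: ${chip_cost_usd / chip_hashrate_mh:.2f}/MH")
# -> $0.75/MH before the board, PSU, cooling, and assembly
```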
sr. member
Activity: 349
Merit: 250
I doubt there's much noise from such a small fan, but how do they stack? Wouldn't the intakes get blocked if they were stacked? Perhaps they have feet on the bottom that are tall enough to allow incoming airflow.
The website states 32 dB as the sound level; hopefully that is the measured sound from a closed unit. If you've ever run a Delta fan, you know what I mean.
rjk
sr. member
Activity: 448
Merit: 250
1ngldh
60W power would still be advantageous to most people. What about the heat and noise they produce?

I doubt there's much noise from such a small fan, but how do they stack? Wouldn't the intakes get blocked if they were stacked? Perhaps they have feet on the bottom that are tall enough to allow incoming airflow.

Hey BFL: What are the approximate internal measurements of these pretty cases you are using (internal, between the binding posts)? I would like to see if one of them could contain a Nano-ITX embedded system. Would you be willing to sell the cases empty for other projects? Or at least the 3D drawings of them, with measurements...
hero member
Activity: 518
Merit: 500
60W power would still be advantageous to most people. What about the heat and noise they produce?
hero member
Activity: 681
Merit: 500
Oops, I meant W/GH, not W/MH.
hero member
Activity: 756
Merit: 500
Keep in mind that 40W more power only costs you about $40 per year more in electricity at typical US rates. (using 11.4c/kWh to make the numbers come out easy)

I like to use 1 year for projecting whether such a mining device is worth investing in. So go ahead and give me a device that uses 60W instead of 20W, discount the purchase price by $40, and I'm happy. A year from now, surely I'll either be out of bitcoin mining altogether, or will have upgraded to much better hardware anyway.

At this point, I mostly just care about $/MH. I can build GPU rigs at about $0.60/MH, and they have more certain future resale value. I'm willing to pay up to $0.75/MH for FPGA-based miners, because going from 400W/MH to 60W/MH is a major savings in power cost, and it's much easier to deal with the heat. At $2.50/BTC at current mining difficulty at 11.4c/kWh, a $0.75/MH 60W/MH rig recoups its cost in 11.5 months.

In my opinion, ROI is becoming the primary factor in bitcoin mining, not so much W/MH. I think few serious investors want to pour money into mining rigs without a good chance of getting it back in a year.

+1 !!!
hero member
Activity: 681
Merit: 500
Keep in mind that 40W more power only costs you about $40 per year more in electricity at typical US rates. (using 11.4c/kWh to make the numbers come out easy)

I like to use 1 year for projecting whether such a mining device is worth investing in. So go ahead and give me a device that uses 60W instead of 20W, discount the purchase price by $40, and I'm happy. A year from now, surely I'll either be out of bitcoin mining altogether, or will have upgraded to much better hardware anyway.

At this point, I mostly just care about $/MH. I can build GPU rigs at about $0.60/MH, and they have more certain future resale value. I'm willing to pay up to $0.75/MH for FPGA-based miners, because going from 400W/GH to 60W/GH is a major savings in power cost, and it's much easier to deal with the heat. At $2.50/BTC at current mining difficulty at 11.4c/kWh, a $0.75/MH 60W/GH rig recoups its cost in 11.5 months.

In my opinion, ROI is becoming the primary factor in bitcoin mining, not so much W/GH. I think few serious investors want to pour money into mining rigs without a good chance of getting it back in a year.
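
A quick sanity check of the arithmetic in this post (a minimal sketch; it assumes 24/7 operation at the stated 11.4c/kWh rate):

```python
# Verify the claim that 40W of extra draw costs about $40/yr.
RATE_USD_PER_KWH = 0.114
HOURS_PER_YEAR = 8766  # average year, running around the clock

def annual_cost_usd(watts: float) -> float:
    """Yearly electricity cost for a constant load."""
    return watts / 1000 * HOURS_PER_YEAR * RATE_USD_PER_KWH

extra = annual_cost_usd(60) - annual_cost_usd(20)
print(f"60W vs 20W: ${extra:.2f} more per year")  # -> ~$39.97
```

The 11.5-month payback figure additionally depends on the BTC price and network difficulty at the time, which aren't spelled out here, so it isn't checked.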
hero member
Activity: 504
Merit: 500
Decent Programmer to boot!
That's the reassurance I was looking for. Power usage is still way lower than anything a GPU array can match, so there are no complaints there.
OK Grin It's easy to miss some stuff by accident; not trying to be pissy at you. Wink

Oh no, I fully understand. I didn't take it like you were being pissy. I don't actively watch the numbers like other people do; I just seem to blurt out what I'm thinking, and after I'm told I'm mostly wrong, I go back to monitoring. It's a pretty easy tactic.
sr. member
Activity: 349
Merit: 250
I certainly wouldn't cancel any pre-order over missing this target. Obviously, the closer to the target, the better!
rjk
sr. member
Activity: 448
Merit: 250
1ngldh
That's the reassurance I was looking for. Power usage is still way lower than anything a GPU array can match, so there are no complaints there.
OK Grin It's easy to miss some stuff by accident; not trying to be pissy at you. Wink
hero member
Activity: 504
Merit: 500
Decent Programmer to boot!
I don't mean to be skeptical (still), but all of their numbers appear to be 'projected numbers'. From what they are saying, they are currently not hitting, or not capable of hitting, these numbers, so they make less optimistic calculations, again and again. People probably won't get 1.05 GH/s at 20W for quite some time in that case.

Umm, I don't mean to spoil your buzz, but:

2.  We're currently running stable in the speed range specified.
3.  The power will likely be higher by an as yet determined margin.

In sum, only one of the three is changing, and as stated in our pre-order terms, performance is guaranteed to meet the specifications listed, or the order can be canceled for a full refund.

That means that the GH/s figure is correct, and only the wattage is higher (at this point). To me, the relative power usage is still WAY under what it could be, and it's therefore a good deal.

That's the reassurance I was looking for. Power usage is still way lower than anything a GPU array can match, so there are no complaints there.
rjk
sr. member
Activity: 448
Merit: 250
1ngldh
I don't mean to be skeptical (still), but all of their numbers appear to be 'projected numbers'. From what they are saying, they are currently not hitting, or not capable of hitting, these numbers, so they make less optimistic calculations, again and again. People probably won't get 1.05 GH/s at 20W for quite some time in that case.

Umm, I don't mean to spoil your buzz, but:

2.  We're currently running stable in the speed range specified.
3.  The power will likely be higher by an as yet determined margin.

In sum, only one of the three is changing, and as stated in our pre-order terms, performance is guaranteed to meet the specifications listed, or the order can be canceled for a full refund.

That means that the GH/s figure is correct, and only the wattage is higher (at this point). To me, the relative power usage is still WAY under what it could be, and it's therefore a good deal.
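
For scale on that last point, here is the same comparison in numbers, using the roughly 400W/GH GPU figure quoted elsewhere in this thread against the revised worst-case BFL draw (a sketch; both numbers are approximate):

```python
# Power needed to sustain 1 GH/s: GPU array vs revised BFL figure.
GPU_W_PER_GHS = 400  # GPU estimate cited upthread
BFL_W_PER_GHS = 60   # revised worst-case for the BitForce unit

ratio = GPU_W_PER_GHS / BFL_W_PER_GHS
print(f"GPUs: {GPU_W_PER_GHS}W vs BFL: {BFL_W_PER_GHS}W "
      f"(~{ratio:.1f}x less power at the same hashrate)")
```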
hero member
Activity: 504
Merit: 500
Decent Programmer to boot!
I don't mean to be skeptical (still), but all of their numbers appear to be 'projected numbers'. From what they are saying, they are currently not hitting, or not capable of hitting, these numbers, so they make less optimistic calculations, again and again. People probably won't get 1.05 GH/s at 20W for quite some time in that case.
hero member
Activity: 686
Merit: 564
I do not think it is really that "drastic"; going from 20W to 60W doesn't add much to the running costs.
It's the difference between having much better power efficiency than the existing non-vapourware options and having noticeably worse efficiency. I'm not sure this would have gotten quite so much attention if they'd claimed 60 watts in the first place.
hero member
Activity: 504
Merit: 500
Where did the original power projection of 20W come from?

How come it was changed so drastically, and is now predicted to be two to three times the original amount, at around 50-60 watts?

Did you mean that each chip consumes 20W?
http://butterflylabs.com/products/

His question was not "Where did the forum speculators get that number from?" It was "Where did BFL come up with the original 19.8W that is stated as the maximum power draw for the unit?"
hero member
Activity: 756
Merit: 500
I do not think it is really that "drastic"; going from 20W to 60W doesn't add much to the running costs.
rjk
sr. member
Activity: 448
Merit: 250
1ngldh
Where did the original power projection of 20W come from?

How come it was changed so drastically, and is now predicted to be two to three times the original amount, at around 50-60 watts?

Did you mean that each chip consumes 20W?
From what I understand, that was the expected load based on 'normal' FPGA workloads; however, mining resulted in higher power usage, which, to be honest, is to be expected.
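
For context on why a mining load blows past a 'normal' estimate: CMOS dynamic power scales with switching activity, and a fully pipelined SHA-256 core toggles nearly every register on every clock, so the activity factor sits far above the defaults that vendor power estimators assume. The standard relation:

```latex
P_{\text{dyn}} = \alpha \, C \, V_{dd}^{2} \, f
```

where \alpha is the switching-activity factor, C the switched capacitance, V_{dd} the core voltage, and f the clock frequency.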
sr. member
Activity: 349
Merit: 250
Where did the original power projection of 20W come from?

How come it was changed so drastically, and is now predicted to be two to three times the original amount, at around 50-60 watts?

Did you mean that each chip consumes 20W?
http://butterflylabs.com/products/
hero member
Activity: 518
Merit: 500
Where did the original power projection of 20W come from?

How come it was changed so drastically, and is now predicted to be two to three times the original amount, at around 50-60 watts?

Did you mean that each chip consumes 20W?
sr. member
Activity: 349
Merit: 250
The BitForce Saga Act III - The Crisis

Grandson: I just met BFL.

Granny:  Oh, and what did that nice man have to say?

Grandson: He said that they may need to change the power specs from 19.8 to between 30 and 60 watts, but they are still working on the problem, so time will tell.

Granny: Wh-what!! This throws off all my mining calculations.  Oh, how will I be able to afford my medication from Silk Road?

Grandson: Relax, Grandma. Although we pay $0.20/kWh, at current difficulty the cost to mine one bitcoin will be $0.30, and likely less.

Granny: OMG, that's triple our original projected cost per bitcoin.

Grandson: Well, these numbers are not too good to be true. Therefore, could it be... NOT a scam?

to be continued?
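
The grandson's number roughly checks out (a sketch; the ~1.1M network difficulty is an assumption for the period, while the 1 GH/s spec, 60W worst case, 50 BTC reward, and $0.20/kWh rate come from the thread):

```python
# Electricity cost to mine one bitcoin at Granny's power rate.
HASHRATE_HS = 1.0e9   # 1 GH/s, the BitForce spec
DIFFICULTY = 1.1e6    # assumed network difficulty at the time
BLOCK_REWARD = 50.0   # BTC per block in 2011
WATTS = 60.0          # revised worst-case power draw
RATE = 0.20           # USD per kWh

btc_per_day = HASHRATE_HS * 86400 / (DIFFICULTY * 2**32) * BLOCK_REWARD
usd_per_day = WATTS / 1000 * 24 * RATE
print(f"~${usd_per_day / btc_per_day:.2f} per BTC")  # -> ~$0.31
```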