Okay, I have 12 RX470s running. Let's assume I have these mining scenarios:
XMR @ 7890 H/s @ 665 W
ETH @ 312 MH/s @ 930 W
Using current rates...
XMR will earn $10.66 per day and cost $3.53 in electricity = $7.13
ETH will earn $11.94 per day and cost $4.91 in electricity = $7.03 (-$0.10)
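For anyone who wants to plug in their own numbers, here's a quick sketch of that math. The revenue figures are just my snapshot from above; the electricity rate of ~$0.22/kWh is implied by my numbers ($4.91/day at 930 W), not something you should assume matches yours:

```python
# Daily mining profit sketch: revenue minus 24h electricity cost.
# All dollar figures are the snapshot from the post; swap in your own.

def daily_profit(revenue_per_day, watts, usd_per_kwh):
    """Return (profit, power_cost) for one day of mining."""
    power_cost = (watts / 1000.0) * 24 * usd_per_kwh
    return revenue_per_day - power_cost, power_cost

USD_PER_KWH = 0.22  # implied by ~$4.91/day at 930 W

xmr_profit, xmr_cost = daily_profit(10.66, 665, USD_PER_KWH)
eth_profit, eth_cost = daily_profit(11.94, 930, USD_PER_KWH)
print(f"XMR: ${xmr_profit:.2f}/day (power ${xmr_cost:.2f})")
print(f"ETH: ${eth_profit:.2f}/day (power ${eth_cost:.2f})")
```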
Pretty close, right? So my coin picker is going to tell me to mine XMR, but I'm seriously questioning this given that it's winter here right now.
Let's assume the heat my rigs produce is worth a 30% supplemental heat credit (electric bill goes up a lot, gas bill goes down a little).
XMR will earn $10.66 per day and cost ($3.53*70%=$2.47) in electricity = $8.19
ETH will earn $11.94 per day and cost ($4.91*70%=$3.44) in electricity = $8.50 (+$0.31)
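The winter adjustment is just a multiplier on the power cost. Here's the same calc as a function; note the 30% credit is my guess, not a measured number:

```python
# Apply a seasonal factor to the power cost: < 1.0 is a winter heat credit
# (some of the electric bill offsets gas heating), > 1.0 would be a summer
# cooling penalty. The 0.70 multiplier (30% credit) is a guess.

def adjusted_profit(revenue_per_day, base_power_cost, season_factor):
    return revenue_per_day - base_power_cost * season_factor

WINTER = 0.70  # 30% heat credit

xmr = adjusted_profit(10.66, 3.53, WINTER)  # 10.66 - 2.47
eth = adjusted_profit(11.94, 4.91, WINTER)  # 11.94 - 3.44
print(f"winter: XMR ${xmr:.2f}/day, ETH ${eth:.2f}/day")
```

With the credit applied, the higher-wattage coin flips into the lead.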
So ETH can earn me a bit more in winter because it burns more watts, and 30% of that electricity cost comes back as a heat credit, offsetting the need for extra gas.
So my question is, does anyone have a good idea or a guesstimate as to what the supplemental heat efficiency percentage really is? I've read that electric resistance heaters are close to 100% efficient at turning electricity into heat, and that's essentially what a mining rig is. Placing the rigs in the rooms you use most can help even more (http://cadetheat.com/blog/how-efficient-are-electric-heaters/).
In the summer we can do the same exercise, but instead of a heat credit you apply a cooling penalty. For consistency, let's say that's also 30%:
XMR will earn $10.66 per day and cost ($3.53*130%=$4.59) in electricity = $6.07
ETH will earn $11.94 per day and cost ($4.91*130%=$6.38) in electricity = $5.56 (-$0.51)
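One more way to look at it: instead of guessing the percentage, you can solve for the break-even heat credit where the two coins tie. With my revenue and cost numbers, ETH pulls ahead at anything above roughly a 7% credit, and any cooling penalty at all favors XMR:

```python
# Solve for the seasonal cost multiplier m where profits tie:
#   rev_xmr - cost_xmr * m == rev_eth - cost_eth * m
# m < 1 is a heat credit of (1 - m); m > 1 is a cooling penalty.

rev_xmr, cost_xmr = 10.66, 3.53
rev_eth, cost_eth = 11.94, 4.91

m = (rev_eth - rev_xmr) / (cost_eth - cost_xmr)  # break-even multiplier
credit = 1 - m
print(f"break-even heat credit: {credit:.1%}")  # ~7.2% with my numbers
```

So the 30% figure doesn't even need to be accurate for the winter/summer switch to make sense; the crossover is way below it.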
So if XMR and ETH pay similar rates, it makes more sense to run XMR in summer and ETH in winter. Or move the rigs to the garage and put a ~150 W box fan on them, which at my electricity rate runs closer to $0.80/day instead of the $1.06/day cooling penalty for extra indoor A/C.
So, what do you think the electricity-to-heat efficiency of a GPU mining rig is? And on the flip side, what's the cooling penalty?