Good idea adding electricity cost, but this makes no sense to me:

--Electricity cost: you can use this converter to get your BTC cost:
http://www.xe.com/es/currencyconverter/convert/?Amount=0%2C012&From=EUR&To=XBT
--Example, one rate for the whole day: [{"HourStart":0,"HourEnd":23,"CostKwhBTC":0.00000105609}]
--Example, two rate periods: [{"HourStart":12,"HourEnd":21,"CostKwhBTC":0.00000105609},{"HourStart":22,"HourEnd":11,"CostKwhBTC":0.00000703759}]
@@ELECTRICITYCOST=[{"HourStart":0,"HourEnd":23,"CostKwhBTC":0.00000105609}]
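For reference, the conversion the linked page performs can be sketched as a one-liner: divide your local electricity rate by the current BTC price in the same currency (the function name and example prices here are illustrative, not part of MM):

```python
# Hypothetical helper: convert a local electricity rate (e.g. EUR/kWh)
# into the BTC-per-kWh value that @@ELECTRICITYCOST expects.
def kwh_cost_in_btc(rate_fiat_per_kwh, btc_price_fiat):
    """rate_fiat_per_kwh: e.g. 0.12 (EUR/kWh); btc_price_fiat: e.g. 11000 (EUR/BTC)."""
    return rate_fiat_per_kwh / btc_price_fiat

# 0.12 EUR/kWh at an assumed 12,000 EUR per BTC
print(kwh_cost_in_btc(0.12, 12000))  # 1e-05
```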
How in the world does this calculate the cost of my electricity? It doesn't take into account my actual rate, and it doesn't take into account the number of watts per rig.
I pay $0.10/kWh in the winter and $0.14 in the summer.
How about something like this instead:
@@ELECTRICITYCOST=.10
@@RIGWATT=830
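The scheme being suggested here could be sketched like this (the setting names and the function are hypothetical, just to show the arithmetic):

```python
# Sketch of the proposed flat-rate scheme: a fiat price per kWh
# (ELECTRICITYCOST) plus the rig's total draw in watts (RIGWATT).
def daily_cost_fiat(rate_per_kwh, rig_watts, hours=24):
    kwh = rig_watts * hours / 1000.0  # watts * hours -> kilowatt-hours
    return kwh * rate_per_kwh

# An 830 W rig at the winter rate of $0.10/kWh
print(daily_cost_fiat(0.10, 830))  # 1.992 ($/day)
```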
I don't get it, man. I removed that line for now.
Please explain in detail how this line works.
Thanks.
Assuming there are no bugs, MM calculates the power draw of each GPU group and derives your power cost from it:
Revenue = ("Pool profit per 1 hashrate" * "Group hashrate") - "Miner fee" - "Pool fee"
"Electricity cost" = ("GPU group power now" * 24 / 1000) * "KWhBtc now"
Profit = Revenue - "Electricity cost"
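The three formulas above can be sketched as follows (all values in BTC per day; the variable names are illustrative, not MM's actual internals):

```python
# Minimal sketch of the profit calculation described above.
def profit_btc_per_day(pool_profit_per_hash, group_hashrate,
                       miner_fee, pool_fee,
                       group_power_watts, kwh_btc_now):
    # Revenue from mining, minus the fees
    revenue = pool_profit_per_hash * group_hashrate - miner_fee - pool_fee
    # Watts over 24 hours, converted to kWh, priced in BTC
    electricity_cost = (group_power_watts * 24 / 1000) * kwh_btc_now
    return revenue - electricity_cost
```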
GPU power for Nvidia is read from nvidia-smi.
GPU power for AMD is hardcoded per model (I haven't found a way to read it), using these model names (please check what model name you have on the current screen or in gpulist.bat):
"Radeon RX 580 Series" -->135W * Power Limit * Usage
"Radeon RX 480 Series" -->135W * Power Limit * Usage
"Radeon RX 570 Series" -->120W * Power Limit * Usage
"Radeon RX 470 Series" -->120W * Power Limit * Usage
"Radeon Vega 56 Series"-->210W * Power Limit * Usage
"Radeon Vega 64 Series"-->230W * Power Limit * Usage
I have seen that on some cards, like the 1050, nvidia-smi does not support power-draw reading.