
Topic: Ethereum mining still profitable? - page 57. (Read 131243 times)

legendary
Activity: 1176
Merit: 1015
March 03, 2016, 10:03:03 PM
ETH mining started last summer.

Cudaminer by cbuchner1 started somewhere in early 2014.
No shit, thanks for the history we already knew. Roll Eyes I was talking about my own posts on the subject they were referring to, which seemed to get everyone in AMD-land upset for some reason. Genoil's fork for ETH on nVidia only started working well enough a few months ago, unless you know better.  Huh

Oops, sorry!

I was happily mining with Genoil's miner last summer. A memory-overclocked 750 Ti did 10 MH/s back then (Linux).

Red fanboys are just as bad as green ones.

full member
Activity: 150
Merit: 100
caeruleum arca archa
March 03, 2016, 09:26:46 PM
ETH mining started last summer.

Cudaminer by cbuchner1 started somewhere in early 2014.
No shit, thanks for the history we already knew. Roll Eyes I was talking about my own posts on the subject they were referring to, which seemed to get everyone in AMD-land upset for some reason. Genoil's CUDA ethminer fork only started working well enough a few months ago, unless you know better.  Huh
legendary
Activity: 1176
Merit: 1015
March 03, 2016, 09:14:54 PM
I don't know why you guys are comparing power when clearly ETH is so profitable that power shouldn't be an issue currently.
You are right, it isn't an issue anymore. But if someone suddenly finds a way to cut power costs 30-40% without losing performance on an 18-month-old platform...
Any idea when this started? I first posted here on January 30 and mentioned it (bad, bad me). ETH was about $2.25 then, so profitability with GTX was much better than AMD when electric cost was a consideration; many people pay rates high enough to make AMD unprofitable. So there.

ETH mining started last summer.

Cudaminer by cbuchner1 started somewhere in early 2014.

legendary
Activity: 1176
Merit: 1015
March 03, 2016, 08:34:45 PM
I don't know why you guys are comparing power when clearly ETH is so profitable that power shouldn't be an issue currently.


You are right, it isn't an issue anymore. But if someone suddenly finds a way to cut power costs 30-40% without losing performance on an 18-month-old platform...



Hmm, not sure how this statement makes sense. Unless you're a professional miner with a large warehouse, you have a hard limit on how many MB/GPU combos you can install.

Say you were to invest $5k, for instance. You could buy:
33 R7 370s ($150 each) that produce 16 MH/s each, for a total of 528 MH/s, drawing 5,000 or so watts.
15 GTX 970s ($330 each) that produce 22 MH/s each, for 330 MH/s, drawing 2,000 or so watts.
15 R9 390s ($330 each) that produce 28.5 MH/s each, for 427.5 MH/s, drawing 3,600-4,000 watts.

I don't list the 280/280X and 7-series cards since you can only find used ones now.

Most folks have a finite amount of space and power, i.e. 200 A service, so if you planned to go big you could fit in far more 970s than the AMD cards since they use far less power.

For the normal small miner the 970s seem to be the way to go, but if you're a professional miner with lots of space and amperage the 390s seem to be the sweet spot, and they are readily available.



The normal small miner should go the 970 way: power efficient and a really good 1080p gaming card. Easy to sell.

About that power statement,

I can mine ETH with 5 x 970 anywhere from 83 MH/s at 600 W to 110 MH/s at 1000 W.






full member
Activity: 150
Merit: 100
caeruleum arca archa
March 03, 2016, 08:28:41 PM
I don't know why you guys are comparing power when clearly ETH is so profitable that power shouldn't be an issue currently.
You are right, it isn't an issue anymore. But if someone suddenly finds a way to cut power costs 30-40% without losing performance on an 18-month-old platform...
Any idea when this started? I first posted here on January 30 and mentioned it (bad, bad me). ETH was about $2.25 then, so profitability with GTX was much better than AMD when electric cost was a consideration; many people pay rates high enough to make AMD unprofitable. So there.
legendary
Activity: 2408
Merit: 1102
Leading Crypto Sports Betting & Casino Platform
March 03, 2016, 07:42:38 PM
I don't know why you guys are comparing power when clearly ETH is so profitable that power shouldn't be an issue currently.


You are right, it isn't an issue anymore. But if someone suddenly finds a way to cut power costs 30-40% without losing performance on an 18-month-old platform...



Hmm, not sure how this statement makes sense. Unless you're a professional miner with a large warehouse, you have a hard limit on how many MB/GPU combos you can install.

Say you were to invest $5k, for instance. You could buy:
33 R7 370s ($150 each) that produce 16 MH/s each, for a total of 528 MH/s, drawing 5,000 or so watts.
15 GTX 970s ($330 each) that produce 22 MH/s each, for 330 MH/s, drawing 2,000 or so watts.
15 R9 390s ($330 each) that produce 28.5 MH/s each, for 427.5 MH/s, drawing 3,600-4,000 watts.

I don't list the 280/280X and 7-series cards since you can only find used ones now.

Most folks have a finite amount of space and power, i.e. 200 A service, so if you planned to go big you could fit in far more 970s than the AMD cards since they use far less power.

For the normal small miner the 970s seem to be the way to go, but if you're a professional miner with lots of space and amperage the 390s seem to be the sweet spot, and they are readily available.
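As a sanity check on the comparison above, here is a quick sketch of the totals. The per-card prices, hashrates and wattages are the post's own estimates, not measured benchmarks:

```python
# Totals for the three $5k build options quoted above.
# Per-card figures are the post's estimates, not measured benchmarks.
cards = [
    # (name, price in $, MH/s per card, approx. watts per card)
    ("R7 370",  150, 16.0, 5000 / 33),
    ("GTX 970", 330, 22.0, 2000 / 15),
    ("R9 390",  330, 28.5, 3800 / 15),
]

budget = 5000
for name, price, mhs, watts in cards:
    n = budget // price                      # whole cards the budget buys
    total_mhs, total_w = n * mhs, n * watts
    print(f"{name}: {n} cards, {total_mhs:.1f} MH/s, "
          f"{total_w:.0f} W, {total_w / total_mhs:.2f} W/MH")
```

At the quoted numbers the 970 comes out clearly ahead on W/MH, which is the post's point about space- and amperage-limited setups.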

legendary
Activity: 1176
Merit: 1015
March 03, 2016, 06:16:47 PM
I don't know why you guys are comparing power when clearly ETH is so profitable that power shouldn't be an issue currently.


You are right, it isn't an issue anymore. But if someone suddenly finds a way to cut power costs 30-40% without losing performance on an 18-month-old platform...

legendary
Activity: 3808
Merit: 1723
Up to 300% + 200 FS deposit bonuses
March 03, 2016, 05:48:07 PM
I don't know why you guys are comparing power when clearly ETH is so profitable that power shouldn't be an issue currently.
legendary
Activity: 1176
Merit: 1015
March 03, 2016, 05:22:25 PM
2 x EVGA TDP-limited takes maybe 280 W, and the rest of the rig the last 50 W.

Completely ignoring the PSU (in)efficiency... lol. That power has to be accounted for somewhere, as I've somewhat crudely done in horseshoes-close fashion; you just can't throw all that blame on the GPUs. Find a mythical 99.9% PSU and then there would be no wiggle room in your assertion.  Grin

Yes, I ignored PSU efficiency on purpose. I am using only gold-rated (90%) PSUs.

full member
Activity: 150
Merit: 100
caeruleum arca archa
March 03, 2016, 04:45:23 PM
2 x EVGA TDP-limited takes maybe 280 W, and the rest of the rig the last 50 W.

Completely ignoring the PSU (in)efficiency... lol. That power has to be accounted for somewhere, as I've somewhat crudely done in horseshoes-close fashion; you just can't throw all that blame on the GPUs. Find a mythical 99.9% PSU and then there would be no wiggle room in your assertion.  Grin
legendary
Activity: 1176
Merit: 1015
March 03, 2016, 03:30:49 PM
Copypaste from the nvidia link you posted earlier:

Note: The below specifications represent this GPU as incorporated into NVIDIA's reference graphics card design. Clock specifications apply while gaming with medium to full GPU utilization. Graphics card specifications may vary by Add-in-card manufacturer. Please refer to the Add-in-card manufacturers' website for actual shipping specifications.


You don't have a reference design card.

You have a card made by what Nvidia calls an add-in-card manufacturer.

Your card does not have the TDP of the reference design. In your case the card has a TDP of 170W because EVGA chose to put that BIOS on your card. EVGA also chose that you can boost your TDP 10% to 187W. EVGA also chose that your card takes 75W from each of the two PCIe power connectors and the rest from the PCIe slot.

So somehow you're saying the power being measured by the software is disconnected from the stated TDP (145W), even though it's represented as %TDP and requires that number to calculate against. And that the percentage would not change if the TDP were somehow set higher, like the 187W you want. Or somehow the TDP on these 970 cards can miraculously reach 128% of reference TDP, something no 970 I've ever seen can do. 980s can barely come close to that, because they also have a significantly different power-delivery architecture, which I'm sure you also know.

You still have to deal with the observed power at the wall, and make the numbers fit as you wish without going over. I've done the best I can to do just that. If you want to add more to the GPU load, you'll just have to take the same amount away from system power, and there isn't much there to play with if you want to make any sense.

EDIT: Here ya go, the miraculous gamebox that gets 42MH/s on a mere 330W, see if you can make the numbers work to your liking:
ASRock Extreme6
Intel i5 4690K Devil's Canyon (oc'd in bios to 4GHz) w/CoolerMaster D92
8GB GSkill DDR3 2400
WD Black 1TB SATA
Corsair CX850M 80 Bronze
2xEVGA 3975-KR SSC ACX2.0+ (SLI bridge connected/enabled) P0 state for mining, 3800memclock, +50coreclock
Fractal case w/3 fans

Never trust what software says when monitoring Maxwell power consumption.

330 W is what that rig should take from the wall when mining, IMO; you are just underestimating your GPUs' consumption. 2 x EVGA TDP-limited takes maybe 280 W, and the rest of the rig the last 50 W. My Nvidia rig happens to be quite similar to yours, the differences being only 5 x GPU, an SSD, a gold-rated PSU and 16 GB memory. I'm sure that if I shut down the staking wallets, proxy servers and 3 x GPU, we are looking at more or less the same wattage when mining. If I want, I can easily go beyond 1000 W with that rig.

Those power pins really matter. Manufacturers at least try to run within spec: a 6-pin is rated at 75 W and an 8-pin at 150 W. Yes, I know those connectors can deliver much more if the GPU asks and the PSU delivers, but that's another story... The average 6+8-pin model is allowed 225 W plus the slot before the BIOS limit kicks in, and some extreme cards have 8+8-pin connectors.

Some serious GTX 970 power consumption:

https://www.techpowerup.com/reviews/Colorful/iGame_GTX_970/25.html
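The connector budget described above can be tallied directly. A minimal sketch using the spec figures from the post (75 W slot, 75 W per 6-pin, 150 W per 8-pin); actual cards can and do exceed these:

```python
# Spec power ceiling for a card, from the PCIe figures quoted above.
# Real cards can draw more if the GPU asks and the PSU delivers.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_limit(six_pins=0, eight_pins=0):
    """Slot power plus whatever the auxiliary connectors are rated for."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_power_limit(six_pins=1, eight_pins=1))  # average 6+8-pin card: 300 W
print(board_power_limit(eight_pins=2))              # extreme 8+8-pin card: 375 W
```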






full member
Activity: 150
Merit: 100
caeruleum arca archa
March 03, 2016, 01:02:35 PM
Copypaste from the nvidia link you posted earlier:

Note: The below specifications represent this GPU as incorporated into NVIDIA's reference graphics card design. Clock specifications apply while gaming with medium to full GPU utilization. Graphics card specifications may vary by Add-in-card manufacturer. Please refer to the Add-in-card manufacturers' website for actual shipping specifications.


You don't have a reference design card.

You have a card made by what Nvidia calls an add-in-card manufacturer.

Your card does not have the TDP of the reference design. In your case the card has a TDP of 170W because EVGA chose to put that BIOS on your card. EVGA also chose that you can boost your TDP 10% to 187W. EVGA also chose that your card takes 75W from each of the two PCIe power connectors and the rest from the PCIe slot.

So somehow you're saying the power being measured by the software is disconnected from the stated TDP (145W), even though it's represented as %TDP and requires that number to calculate against. And that the percentage would not change if the TDP were somehow set higher, like the 187W you want. Or somehow the TDP on these 970 cards can miraculously reach 128% of reference TDP, something no 970 I've ever seen can do. 980s can barely come close to that, because they also have a significantly different power-delivery architecture, which I'm sure you also know.

You still have to deal with the observed power at the wall, and make the numbers fit as you wish without going over. I've done the best I can to do just that. If you want to add more to the GPU load, you'll just have to take the same amount away from system power, and there isn't much there to play with if you want to make any sense.
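The disagreement above comes down to which baseline the software's %TDP reading is multiplied against. A toy illustration (145 W is the reference TDP cited here; 170 W is the factory-BIOS TDP claimed by the other poster):

```python
def watts_from_pct_tdp(pct, tdp_w):
    """Convert a software %TDP reading to watts against a given TDP baseline."""
    return pct / 100 * tdp_w

pct = 75  # an example %TDP reading while mining
print(watts_from_pct_tdp(pct, 145))  # against the 145 W reference TDP: 108.75
print(watts_from_pct_tdp(pct, 170))  # against the claimed 170 W BIOS TDP: 127.5
```

The same percentage reading implies noticeably different watts per card depending on the baseline, which is exactly the gap the two posters are arguing over.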

EDIT: Here ya go, the miraculous gamebox that gets 42MH/s on a mere 330W, see if you can make the numbers work to your liking:
ASRock Extreme6
Intel i5 4690K Devil's Canyon (oc'd in bios to 4GHz) w/CoolerMaster D92
8GB GSkill DDR3 2400
WD Black 1TB SATA
Corsair CX850M 80 Bronze
2xEVGA 3975-KR SSC ACX2.0+ (SLI bridge connected/enabled) P0 state for mining, 3800memclock, +50coreclock
Fractal case w/3 fans

What card do you have? Maybe you have a reference card that is very limited by default, and with the TDP limitation it is even more limited; otherwise it's not possible with the hashrate you are getting.

Like I said, EVGA SSC ACX2.0+ 3975-KR. The numbers still add up, whether at stock clocks or overclocked. I'm begging anyone to come up with the numbers they believe are right that match my observed loads at the wall, which I've also confirmed by checking with another watt meter just to make sure.
legendary
Activity: 1453
Merit: 1011
Bitcoin Talks Bullshit Walks
March 03, 2016, 11:20:53 AM
"I sometimes make the comparison of a pocket calculator [Bitcoin] versus, say, a general purpose computer [Ethereum]." - Nick Szabo
https://blog.ethereum.org/2015/10/22/nick-szabo-confirmed-as-keynote-speaker-of-ethereums-devcon1/

Looks like the market is starting to see this as well Smiley  Great job, Ethereum! Show Bitcoin it can be done Smiley

Best Regards
Doug
legendary
Activity: 2590
Merit: 1022
Leading Crypto Sports Betting & Casino Platform
March 03, 2016, 04:18:59 AM
legendary
Activity: 2408
Merit: 1102
Leading Crypto Sports Betting & Casino Platform
March 03, 2016, 03:57:44 AM
legendary
Activity: 2408
Merit: 1102
Leading Crypto Sports Betting & Casino Platform
March 03, 2016, 03:55:03 AM
What is going on with ETH right now is CLEARLY NUTS!!! It's like 2013 all over again, with LTC going from $1 to $50.

Those who mined in February and held must have made a fortune.

I mined a ton, like 5k, but only held like 500. Really mad at myself.
legendary
Activity: 1176
Merit: 1015
March 02, 2016, 05:58:21 PM
full member
Activity: 150
Merit: 100
caeruleum arca archa
March 02, 2016, 05:20:45 PM
Much of the original point I tried to make has evaporated in light of ETH >0.015 BTC (0.017? 0.019!?! 0.02!!!); electric costs should be a non-consideration for most. That said:

But that extract fan power consumption is due to the mining as well, so we have to take that into account.

100% GPU TDP on 970 = 145W

Do the math, break it down, here's the numbers:

Overclocked, maintaining <70C core temp (high fan profile):
gpu1 (incl. fans on card): 108W (75%TDP)
gpu2 (incl. fans on card): 115W (80%TDP)
4 case and 2 cpu fans, hdd, mobo/cpu: 50W (case fans ramped to 60%)
Total so far: 273W
PSU loss: 55W (80% efficiency; 0.2 * 273 ≈ 55W)
273+55 = 328W, or basically what I've monitored on the UPS
42MH/s, 7.8W/MH (total system), 5.3W/MH (gpus only)
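The arithmetic in the overclocked breakdown above, reproduced as a sketch. The wattages are the post's estimates, and note that the post approximates PSU loss as 20% of the DC load; strictly dividing by efficiency would give a slightly higher wall figure:

```python
# Reproducing the post's overclocked power breakdown.
gpu_w = 108 + 115          # the two 970s, including their fans
rest_w = 50                # case/CPU fans, HDD, mobo/CPU
dc_load = gpu_w + rest_w   # 273 W
psu_loss = 0.2 * dc_load   # the post's approximation at 80% efficiency (~55 W)
wall_w = dc_load + psu_loss
hashrate = 42.0            # MH/s
print(f"{wall_w:.0f} W wall, {wall_w / hashrate:.1f} W/MH system, "
      f"{gpu_w / hashrate:.1f} W/MH GPUs only")
# Strictly, an 80%-efficient PSU would draw dc_load / 0.8 ≈ 341 W from the wall.
```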

Stock, maintaining 75C core temps, stock fan profile, both @65%TDP:
gpu1: 90W
gpu2: 90W
case fans+hdd+mobo+cpu: 30W (silent pc mode)
Total so far: 210W
PSU loss: 42W
Total system: 264W
36MH/s, 7.3W/MH (total system), 5.0W/MH (gpus only)

Something I'm going to try is disabling Windows Aero. Even though it's a headless system and it's not doing anything, some have mentioned it stealing GPU power all the time regardless of use, thereby decreasing hashrate. I've seen weirder things, especially on Winblows. God, I've got to roll Mint or something.

I've seen the "my 280X will do xx MH on yyy W with zzz V undervolt" claims before; I still have no clear idea, because so many things play a factor in pushing the limit. I've seen people today who own multiple rigs running mixes of 7950s and 280Xs, undervolted/clocked/tweaked/whatever, drawing over 3 kW to get 350 MH/s (9 W/MH!), and they're happy with that. To each their own; ETH's at 0.02 BTC, electric ain't much of a matter no mo.

I think I'm done with this particular thread.  Grin Grin Grin Lips sealed Lips sealed Lips sealed Tongue Tongue Tongue  Cheesy Cheesy Cheesy
sr. member
Activity: 462
Merit: 251
March 02, 2016, 05:10:16 PM
Is the DAG algorithm used by Ethereum ASIC resistant? I just saw the ASIC of X11 is coming out in another thread.

.... because the mining will end in a few months anyway.



Huh? I missed that.
Can you explain that to me?
legendary
Activity: 3808
Merit: 1723
Up to 300% + 200 FS deposit bonuses
March 02, 2016, 04:20:37 PM
What is going on with ETH right now is CLEARLY NUTS!!! It's like 2013 all over again, with LTC going from $1 to $50.

Those who mined in February and held must have made a fortune.