
Topic: Ethereum mining still profitable? - page 26. (Read 131303 times)

legendary
Activity: 3248
Merit: 1072
May 27, 2016, 08:06:21 AM
I will build 6x390 with undervolting and underclocking. That is more efficient than the 6x380, but more expensive.
Yes, definitely more efficient, but I was looking for info about 6 x 380 vs 5 x 390.
Was wondering about hash rate / power supply sizes etc.

Is it  380 x 6 = 20 x 6 = 120 MH/s and a 1300 Watt PS
and  390 x 5 = 30 x 5 = 150 MH/s and how many Watt PS?

Just trying to stay away from building with two power supplies.

If you undervolt the R9 390 to -200 mV, it will consume about 160W at 27 MH/s. So for 6, it is 960W, plus system, it is about 1060W. But you can undervolt less, to say -100 mV, it is 230W at 30.5 MH/s.

last time it was 187w for 27mh, not 160w, still waiting for pics of this, i provided a pic of my little rig doing 20mh with only 124w

still no pic of a single 390 doing 27mh with only 160w, i smell bullshit here, it's probably 180-190w
newbie
Activity: 33
Merit: 0
May 27, 2016, 02:42:48 AM
I have 1x 7990 + 5x 390; it consumes 1080W and does 169 MH/s. But the 7990 runs at 900 mV and the 390s run at 977 mV.
hero member
Activity: 2842
Merit: 772
May 27, 2016, 02:08:01 AM
I will build 6x390 with undervolting and underclocking. That is more efficient than the 6x380, but more expensive.
Yes, definitely more efficient, but I was looking for info about 6 x 380 vs 5 x 390.
Was wondering about hash rate / power supply sizes etc.

Is it  380 x 6 = 20 x 6 = 120 MH/s and a 1300 Watt PS
and  390 x 5 = 30 x 5 = 150 MH/s and how many Watt PS?

Just trying to stay away from building with two power supplies.

If you undervolt the R9 390 to -200 mV, it will consume about 160W at 27 MH/s. So for 6, it is 960W, plus system, it is about 1060W. But you can undervolt less, to say -100 mV, it is 230W at 30.5 MH/s.
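A minimal sketch of that PSU math in Python, using the per-card draws quoted above; the 100W system overhead and the 80% PSU loading rule of thumb are my own assumptions, not figures from the thread:

Code:
# Rough rig power / PSU sizing sketch. Per-card draws are the figures quoted above;
# the system overhead and 80% loading rule are assumptions, not thread data.

def psu_estimate(cards, watts_per_card, system_overhead_w=100, max_psu_load=0.80):
    """Return (total wall draw, recommended PSU size) for a GPU mining rig."""
    gpu_w = cards * watts_per_card
    total_w = gpu_w + system_overhead_w          # CPU, board, risers, fans
    recommended_psu_w = total_w / max_psu_load   # keep the PSU at ~80% load or less
    return total_w, recommended_psu_w

for label, cards, w in [("6x R9 390 @ -200mV (160W each)", 6, 160),
                        ("6x R9 390 @ -100mV (230W each)", 6, 230)]:
    total, psu = psu_estimate(cards, w)
    print(f"{label}: ~{total:.0f}W total, PSU of at least {psu:.0f}W")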
hero member
Activity: 672
Merit: 500
May 27, 2016, 01:39:42 AM
I will build 6x390 with undervolting and underclocking. That is more efficient than the 6x380, but more expensive.
Yes, definitely more efficient, but I was looking for info about 6 x 380 vs 5 x 390.
Was wondering about hash rate / power supply sizes etc.

Is it  380 x 6 = 20 x 6 = 120 MH/s and a 1300 Watt PS
and  390 x 5 = 30 x 5 = 150 MH/s and how many Watt PS?

Just trying to stay away from building with two power supplies.
legendary
Activity: 3500
Merit: 3237
Happy New year 🤗
May 27, 2016, 01:35:15 AM
Only if you do cloud mining could it work, yes.
Cloud mining is not worth it.. better to rent a vps to start mining ethereum.. and i think this is profitable, unlike cloud mining..
Also you can find free vps offers on google but they need a credit card to verify that you are a real person and not a bot..
full member
Activity: 213
Merit: 100
May 27, 2016, 12:52:26 AM
So I was thinking of building a few 380 x 6 rigs with an EVGA Gold 1300.
Would I be better off building a 5 x 390 rig? What power supply would that take?
Which is more bang for the buck?

I will build 6x390 with undervolting and underclocking. That is more efficient than the 6x380, but more expensive.
full member
Activity: 120
Merit: 100
May 26, 2016, 04:40:10 PM
Only if you do cloud mining could it work, yes.
hero member
Activity: 672
Merit: 500
May 26, 2016, 01:37:00 PM
So I was thinking of building a few 380 x 6 rigs with an EVGA Gold 1300.
Would I be better off building a 5 x 390 rig? What power supply would that take?
Which is more bang for the buck?
legendary
Activity: 3248
Merit: 1072
May 26, 2016, 01:27:43 PM

you are not understanding what you are posting

it's obvious that more hash equals more profit, but this does not mean that the 390 is better than a 970

you need to do things the right way, by calculating with the same hash, so put this in the calculator:

3x970 at 120w each, 60MH total, vs 2x390 at 230w each, 60MH total, this with your example...

I do understand very well what I am posting.

What I didn't understand was your way of calculating ethereum mining profitability but now I get it. You completely forgot that street price for 970=390 so you need 50% more funding to start. In this case that is 20-25 ETH.

Only a fool would buy 970 instead of 390 for ethereum mining.


well i was talking about efficiency not investment, because after you reach roi you are not making more with a 390

yes they cost the same, but the initial investment is not so important, the gpu can be sold later at a good price

and in efficiency they are equal, if a 970 can do 20 MH with 125w

So for 3x970 the profit is $6.99, and for 2x390 the profit is $6.76. Not much difference. But 3x970 is 50% more expensive.

sure but my entire point is that they have the same efficiency, people were saying that the 390 can consume less and hash more

but in the end they have the same ratio between hashing and consumption

btw i paid only 240 euro for my 970 g1 gaming and i can't find a 390 for the same price even used so...
legendary
Activity: 1176
Merit: 1015
May 26, 2016, 01:15:58 PM

you are not understanding what you are posting

it's obvious that more hash equals more profit, but this does not mean that the 390 is better than a 970

you need to do things the right way, by calculating with the same hash, so put this in the calculator:

3x970 at 120w each, 60MH total, vs 2x390 at 230w each, 60MH total, this with your example...

I do understand very well what I am posting.

What I didn't understand was your way of calculating ethereum mining profitability but now I get it. You completely forgot that street price for 970=390 so you need 50% more funding to start. In this case that is 20-25 ETH.

Only a fool would buy 970 instead of 390 for ethereum mining.
sr. member
Activity: 296
Merit: 250
May 26, 2016, 12:17:46 PM
So for 3x970 the profit is $6.99, and for 2x390 the profit is $6.76. Not much difference. But 3x970 is 50% more expensive.
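For anyone who wants to reproduce those numbers: a short sketch using the figures quoted in this thread (whattomine's ~$2.62/day for 20 MH/s and $0.10/kWh electricity); the revenue-per-MH constant is just backed out of that snapshot, so it moves with ETH price and difficulty:

Code:
# Equal-hashrate comparison with the thread's figures:
# GTX 970: 20 MH/s at 120 W, R9 390: 30 MH/s at 230 W, electricity $0.10/kWh.
REV_PER_MH_DAY = 2.62 / 20        # ~$0.131 per MH/s per day (whattomine snapshot)
KWH_PRICE = 0.10                  # $/kWh

def daily_profit(n_cards, mh_per_card, watts_per_card):
    revenue = n_cards * mh_per_card * REV_PER_MH_DAY
    power_cost = n_cards * watts_per_card * 24 / 1000 * KWH_PRICE
    return revenue - power_cost

print(f"3x GTX 970 (60 MH/s): ${daily_profit(3, 20, 120):.2f}/day")   # ~$6.99
print(f"2x R9 390  (60 MH/s): ${daily_profit(2, 30, 230):.2f}/day")   # ~$6.76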
legendary
Activity: 3248
Merit: 1072
May 26, 2016, 11:49:47 AM

you can not compare like this, because different cards have different consumption

i mean your card probably has only two fans and fewer components than a g1 gaming

i just tested and i can do 20MH with only 124w; now from those 124 you need to subtract something for the fact that the g1 gaming has 3 fans and a more power hungry pcb, because of the bigger heatsink


With the same type of memory and core & mem speeds they all have more or less the same power consumption.

Bigger than 6+6 pin power connectors just let you use more power, and bigger heatsinks and more fans give you the tools for getting rid of that increased heat.

A 980 G1 is just as efficient at 20MH as a Strix 970.


that "more or less" can make the difference here, since we are talking about a very small difference, about 10w per card

this is what i'm talking about

If you pay $0.1/ Kwh your mining rev/ profit would be (according to whattomine):

GTX970 20MH/120w $2.62/ 2.33 per day
AMD390 30MH/230w $3.93/ 3.38 per day

I just can't see how 10w could be a gamechanger here.


30MH at 230w isn't better than 20MH at 120w, 0.13 vs 0.16 ratio

30MH is only 50% more than 20MH, but 230w isn't 50% more than 120w, but almost 2x more, so it's worse

What I am trying to say is that, this thread being about eth mining profitability, the 390 wins against the 970 hands down, 3.38 to 2.33. Even when I used numbers that favor the 970. Only a fool would buy a 970 over a 390 for eth mining.

Hash/watt is maybe worth another thread. My 390s kill my 970s in that competition too. My 970s need more than 120w for 20MH and my 390s less than 230w for 30MH.


well not my case then, as you can see my 970 can do 20MH with 125w, which is better than a 390 or at least equal, so i don't agree with "only a fool would buy a 970 instead of a 390"

i'm talking about the g1 gaming, maybe other gpus have low performance because of poor components, poor memory or stuff like that

Look at my example again.

If 970 used ZERO watts it would earn only $2.62/ day. 390 profits $3.38 after 230 watts expenses.

Only a fool would buy a 970 instead of 390 for ethereum mining.


you are not understanding what you are posting

it's obvious that more hash equals more profit, but this does not mean that the 390 is better than a 970

you need to do things the right way, by calculating with the same hash, so put this in the calculator:

3x970 at 120w each, 60MH total, vs 2x390 at 230w each, 60MH total, this with your example...
legendary
Activity: 1176
Merit: 1015
May 26, 2016, 11:07:59 AM

you can not compare like this, because different cards have different consumption

i mean your card probably has only two fans and fewer components than a g1 gaming

i just tested and i can do 20MH with only 124w; now from those 124 you need to subtract something for the fact that the g1 gaming has 3 fans and a more power hungry pcb, because of the bigger heatsink


With the same type of memory and core & mem speeds they all have more or less the same power consumption.

Bigger than 6+6 pin power connectors just let you use more power, and bigger heatsinks and more fans give you the tools for getting rid of that increased heat.

A 980 G1 is just as efficient at 20MH as a Strix 970.


that "more or less" can make the difference here, since we are talking about a very small difference, about 10w per card

this is what i'm talking about

If you pay $0.1/ Kwh your mining rev/ profit would be (according to whattomine):

GTX970 20MH/120w $2.62/ 2.33 per day
AMD390 30MH/230w $3.93/ 3.38 per day

I just can't see how 10w could be a gamechanger here.


30MH at 230w isn't better than 20MH at 120w, 0.13 vs 0.16 ratio

30MH is only 50% more than 20MH, but 230w isn't 50% more than 120w, but almost 2x more, so it's worse

What I am trying to say is that, this thread being about eth mining profitability, the 390 wins against the 970 hands down, 3.38 to 2.33. Even when I used numbers that favor the 970. Only a fool would buy a 970 over a 390 for eth mining.

Hash/watt is maybe worth another thread. My 390s kill my 970s in that competition too. My 970s need more than 120w for 20MH and my 390s less than 230w for 30MH.


well not my case then, as you can see my 970 can do 20MH with 125w, which is better than a 390 or at least equal, so i don't agree with "only a fool would buy a 970 instead of a 390"

i'm talking about the g1 gaming, maybe other gpus have low performance because of poor components, poor memory or stuff like that

Look at my example again.

If 970 used ZERO watts it would earn only $2.62/ day. 390 profits $3.38 after 230 watts expenses.

Only a fool would buy a 970 instead of 390 for ethereum mining.
legendary
Activity: 3248
Merit: 1072
May 26, 2016, 10:29:32 AM

you can not compare like this, because different cards have different consumption

i mean your card probably has only two fans and fewer components than a g1 gaming

i just tested and i can do 20MH with only 124w; now from those 124 you need to subtract something for the fact that the g1 gaming has 3 fans and a more power hungry pcb, because of the bigger heatsink


With the same type of memory and core & mem speeds they all have more or less the same power consumption.

Bigger than 6+6 pin power connectors just let you use more power, and bigger heatsinks and more fans give you the tools for getting rid of that increased heat.

A 980 G1 is just as efficient at 20MH as a Strix 970.


that "more or less" can make the difference here, since we are talking about a very small difference, about 10w per card

this is what i'm talking about

If you pay $0.1/ Kwh your mining rev/ profit would be (according to whattomine):

GTX970 20MH/120w $2.62/ 2.33 per day
AMD390 30MH/230w $3.93/ 3.38 per day

I just can't see how 10w could be a gamechanger here.


30MH at 230w isn't better than 20MH at 120w, 0.13 vs 0.16 ratio

30MH is only 50% more than 20MH, but 230w isn't 50% more than 120w, but almost 2x more, so it's worse

What I am trying to say is that, this thread being about eth mining profitability, the 390 wins against the 970 hands down, 3.38 to 2.33. Even when I used numbers that favor the 970. Only a fool would buy a 970 over a 390 for eth mining.

Hash/watt is maybe worth another thread. My 390s kill my 970s in that competition too. My 970s need more than 120w for 20MH and my 390s less than 230w for 30MH.


well not my case then, as you can see my 970 can do 20MH with 125w, which is better than a 390 or at least equal, so i don't agree with "only a fool would buy a 970 instead of a 390"

i'm talking about the g1 gaming, maybe other gpus have low performance because of poor components, poor memory or stuff like that
legendary
Activity: 1176
Merit: 1015
May 26, 2016, 10:18:50 AM

No GTX 970 uses only 120 watts, my god, stop with the misinformation. Bad info like this led me to buy 18 GTX 970s of different brands, and guess what I found?

They all pull on average 150 to 170 watts to get 20mhs. If you enable SMI you start seeing wattage in the 190s-200s to get 22mhs.

Believe me, I'm slowly trying to get rid of ALL my GTX 970s since they can't even dual mine and use this much power. Don't believe these 120-watt clowns, I can show pics

if needed, with all the cards hooked up to my PDU. With six cards I would see 4-5 amps, which is close to 1200 watts.


People are not stupid btw, they are buying 390s for eth mining over GTX 970s, those in the know know what's up. There's a reason why on ethermining AMD cards are recommended hands down.

I haven't yet seen a 970 or 980 that can do 20MH with 120 watts, but 18MH is really easy.

Would I choose a 970 for eth mining over AMD? No.
sr. member
Activity: 385
Merit: 250
May 26, 2016, 09:18:13 AM

you can not compare like this, because different cards have different consumption

i mean your card probably has only two fans and fewer components than a g1 gaming

i just tested and i can do 20MH with only 124w; now from those 124 you need to subtract something for the fact that the g1 gaming has 3 fans and a more power hungry pcb, because of the bigger heatsink


With the same type of memory and core & mem speeds they all have more or less the same power consumption.

Bigger than 6+6 pin power connectors just let you use more power, and bigger heatsinks and more fans give you the tools for getting rid of that increased heat.

A 980 G1 is just as efficient at 20MH as a Strix 970.


that "more or less" can make the difference here, since we are talking about a very small difference, about 10w per card

this is what i'm talking about

If you pay $0.1/ Kwh your mining rev/ profit would be (according to whattomine):

GTX970 20MH/120w $2.62/ 2.33 per day
AMD390 30MH/230w $3.93/ 3.38 per day

I just can't see how 10w could be a gamechanger here.


30MH at 230w isn't better than 20MH at 120w, 0.13 vs 0.16 ratio

30MH is only 50% more than 20MH, but 230w isn't 50% more than 120w, but almost 2x more, so it's worse

That is right. It depends on the price of Ethereum. If the price drops 50%, then it might be better to use a 970.

What I am trying to say is that, this thread being about eth mining profitability, the 390 wins against the 970 hands down, 3.38 to 2.33. Even when I used numbers that favor the 970. Only a fool would buy a 970 over a 390 for eth mining.

Hash/watt is maybe worth another thread. My 390s kill my 970s in that competition too. My 970s need more than 120w for 20MH and my 390s less than 230w for 30MH.
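The "depends on the price" point can be made concrete with the same per-card figures: the 390 earns 10 MH/s more but pays for about 110 W more, so there is a revenue level below which a single 970 out-earns a single 390 per day. A rough sketch (capital cost and resale ignored on purpose):

Code:
# At what revenue per MH/s per day does a GTX 970 (20 MH/s, 120 W) start to out-earn
# an R9 390 (30 MH/s, 230 W)?  Electricity at $0.10/kWh as in the thread.
KWH_PRICE = 0.10

extra_power_cost = (230 - 120) * 24 / 1000 * KWH_PRICE   # extra $/day the 390 pays
extra_hash = 30 - 20                                      # extra MH/s the 390 earns on
breakeven = extra_power_cost / extra_hash
print(f"break-even revenue: ${breakeven:.4f} per MH/s per day")   # ~$0.0264

current = 2.62 / 20   # ~$0.131/MH/day, backed out of the quoted whattomine figure
print(f"current revenue:    ${current:.4f} per MH/s per day")
# With these numbers revenue would have to fall by roughly 80%, not 50%, before the
# 970 wins per day at $0.10/kWh; a higher electricity price moves the break-even up.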

legendary
Activity: 1176
Merit: 1015
May 26, 2016, 08:55:05 AM

you can not compare like this, because different cards have different consumption

i mean your card probably has only two fans and fewer components than a g1 gaming

i just tested and i can do 20MH with only 124w; now from those 124 you need to subtract something for the fact that the g1 gaming has 3 fans and a more power hungry pcb, because of the bigger heatsink


With the same type of memory and core & mem speeds they all have more or less the same power consumption.

Bigger than 6+6 pin power connectors just let you use more power, and bigger heatsinks and more fans give you the tools for getting rid of that increased heat.

A 980 G1 is just as efficient at 20MH as a Strix 970.


that "more or less" can make the difference here, since we are talking about a very small difference, about 10w per card

this is what i'm talking about

If you pay $0.1/ Kwh your mining rev/ profit would be (according to whattomine):

GTX970 20MH/120w $2.62/ 2.33 per day
AMD390 30MH/230w $3.93/ 3.38 per day

I just can't see how 10w could be a gamechanger here.


30MH at 230w isn't better than 20MH at 120w, 0.13 vs 0.16 ratio

30MH is only 50% more than 20MH, but 230w isn't 50% more than 120w, but almost 2x more, so it's worse

What I am trying to say is that, this thread being about eth mining profitability, the 390 wins against the 970 hands down, 3.38 to 2.33. Even when I used numbers that favor the 970. Only a fool would buy a 970 over a 390 for eth mining.

Hash/watt is maybe worth another thread. My 390s kill my 970s in that competition too. My 970s need more than 120w for 20MH and my 390s less than 230w for 30MH.
legendary
Activity: 3248
Merit: 1072
May 26, 2016, 02:04:44 AM

you can not compare like this, because different cards have different consumption

i mean your card probably has only two fans and fewer components than a g1 gaming

i just tested and i can do 20MH with only 124w; now from those 124 you need to subtract something for the fact that the g1 gaming has 3 fans and a more power hungry pcb, because of the bigger heatsink


With the same type of memory and core & mem speeds they all have more or less the same power consumption.

Bigger than 6+6 pin power connectors just let you use more power, and bigger heatsinks and more fans give you the tools for getting rid of that increased heat.

A 980 G1 is just as efficient at 20MH as a Strix 970.




that "more or less" can make the difference here, since we are talking about a very small difference, about 10w per card

this is what i'm talking about

If you pay $0.1/ Kwh your mining rev/ profit would be (according to whattomine):

GTX970 20MH/120w $2.62/ 2.33 per day
AMD390 30MH/230w $3.93/ 3.38 per day

I just can't see how 10w could be a gamechanger here.


30MH at 230w isn't better than 20MH at 120w, 0.13 vs 0.16 ratio

30MH is only 50% more than 20MH, but 230w isn't 50% more than 120w, but almost 2x more, so it's worse


No GTX 970 uses only 120 watts, my god, stop with the misinformation. Bad info like this led me to buy 18 GTX 970s of different brands, and guess what I found?

They all pull on average 150 to 170 watts to get 20mhs. If you enable SMI you start seeing wattage in the 190s-200s to get 22mhs.

Believe me, I'm slowly trying to get rid of ALL my GTX 970s since they can't even dual mine and use this much power. Don't believe these 120-watt clowns, I can show pics

if needed, with all the cards hooked up to my PDU. With six cards I would see 4-5 amps, which is close to 1200 watts.


People are not stupid btw, they are buying 390s for eth mining over GTX 970s, those in the know know what's up. There's a reason why on ethermining AMD cards are recommended hands down.

misinformation you say?



before you say that smi isn't accurate, i have tested with a wattmeter and it reports the same
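For anyone who wants to cross-check SMI against a wall meter themselves, a minimal polling sketch (NVIDIA cards only; nvidia-smi reports board power as the driver sees it, so risers, CPU, motherboard and PSU losses come on top of it and a wattmeter will read higher):

Code:
# Poll nvidia-smi for per-GPU power draw to compare against a wall wattmeter.
import subprocess, time

def gpu_power_draw():
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=index,power.draw",
         "--format=csv,noheader,nounits"],
        text=True)
    readings = []
    for line in out.strip().splitlines():
        idx, watts = line.split(",")
        readings.append((int(idx), float(watts)))
    return readings

if __name__ == "__main__":
    for _ in range(10):                      # ten samples, one per second
        readings = gpu_power_draw()
        total = sum(w for _, w in readings)
        print(readings, f"-> total {total:.1f} W")
        time.sleep(1)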
legendary
Activity: 2408
Merit: 1102
Leading Crypto Sports Betting & Casino Platform
May 26, 2016, 01:40:50 AM

you can not compare like this, because different cards have different consumption

i mean your card probably has only two fans and fewer components than a g1 gaming

i just tested and i can do 20MH with only 124w; now from those 124 you need to subtract something for the fact that the g1 gaming has 3 fans and a more power hungry pcb, because of the bigger heatsink


With the same type of memory and core & mem speeds they all have more or less the same power consumption.

Bigger than 6+6 pin power connectors just let you use more power, and bigger heatsinks and more fans give you the tools for getting rid of that increased heat.

A 980 G1 is just as efficient at 20MH as a Strix 970.




that "more or less" can make the difference here, since we are talking about a very small difference, about 10w per card

this is what i'm talking about

If you pay $0.1/ Kwh your mining rev/ profit would be (according to whattomine):

GTX970 20MH/120w $2.62/ 2.33 per day
AMD390 30MH/230w $3.93/ 3.38 per day

I just can't see how 10w could be a gamechanger here.


30MH at 230w isn't better than 20MH at 120w, 0.13 vs 0.16 ratio

30MH is only 50% more than 20MH, but 230w isn't 50% more than 120w, but almost 2x more, so it's worse


No GTX 970 uses only 120 watts, my god, stop with the misinformation. Bad info like this led me to buy 18 GTX 970s of different brands, and guess what I found?

They all pull on average 150 to 170 watts to get 20mhs. If you enable SMI you start seeing wattage in the 190s-200s to get 22mhs.

Believe me, I'm slowly trying to get rid of ALL my GTX 970s since they can't even dual mine and use this much power. Don't believe these 120-watt clowns, I can show pics

if needed, with all the cards hooked up to my PDU. With six cards I would see 4-5 amps, which is close to 1200 watts.


People are not stupid btw, they are buying 390s for eth mining over GTX 970s, those in the know know what's up. There's a reason why on ethermining AMD cards are recommended hands down.
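A quick back-of-envelope on that PDU reading; 240 V is an assumption here (it is what makes 5 A land near the "close to 1200 watts" quoted above, on 120 V the same current would only be ~600 W):

Code:
# Convert the observed PDU current into watts and a rough per-card figure.
MAINS_V = 240             # assumption; adjust for your circuit
CARDS = 6
SYSTEM_OVERHEAD_W = 100   # CPU/board/risers guess, not a measured number

for amps in (4.0, 5.0):
    wall_w = amps * MAINS_V
    per_card = (wall_w - SYSTEM_OVERHEAD_W) / CARDS
    print(f"{amps:.1f} A @ {MAINS_V} V = {wall_w:.0f} W at the wall, ~{per_card:.0f} W per card")

# Wall power also includes PSU losses; multiply by your PSU efficiency (~0.9 for an
# 80+ Gold unit) to estimate the board power the cards themselves are pulling.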
legendary
Activity: 3248
Merit: 1072
May 26, 2016, 01:32:27 AM

you can not compare like this, because different cards have different consumption

i mean your card probably has only two fans and fewer components than a g1 gaming

i just tested and i can do 20MH with only 124w; now from those 124 you need to subtract something for the fact that the g1 gaming has 3 fans and a more power hungry pcb, because of the bigger heatsink


With the same type of memory and core & mem speeds they all have more or less the same power consumption.

Bigger than 6+6 pin power connectors just let you use more power, and bigger heatsinks and more fans give you the tools for getting rid of that increased heat.

A 980 G1 is just as efficient at 20MH as a Strix 970.


that "more or less" can make the difference here, since we are talking about a very small difference, about 10w per card

this is what i'm talking about

If you pay $0.1/ Kwh your mining rev/ profit would be (according to whattomine):

GTX970 20MH/120w $2.62/ 2.33 per day
AMD390 30MH/230w $3.93/ 3.38 per day

I just can't see how 10w could be a gamechanger here.


30MH at 230w isn't better than 20MH at 120w, 0.13 vs 0.16 ratio

30MH is only 50% more than 20MH, but 230w isn't 50% more than 120w, but almost 2x more, so it's worse
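The ratio being argued here is just hashrate per watt; a tiny sketch with the figures used in this thread (the wattages are the disputed claims, so swap in your own measurements):

Code:
# Hashes-per-watt comparison with the numbers used in this thread.
cards = {
    "GTX 970": (20, 120),   # MH/s, claimed watts
    "R9 390":  (30, 230),
}
for name, (mh, watts) in cards.items():
    print(f"{name}: {mh / watts:.3f} MH/s per watt")
# GTX 970: ~0.167 MH/s per watt, R9 390: ~0.130 -- so per watt the 970 comes out
# ahead with these numbers, even though the 390 earns more per card per day.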