
Topic: CCminer(SP-MOD) Modded NVIDIA Maxwell / Pascal kernels. - page 485. (Read 2347659 times)

legendary
Activity: 3164
Merit: 1003
Thanks for your answers guys, all your advice is welcome.
try to use all identical cards.
i'll go for a 1070. if you sell your 3 750Ti's, that should pay for about 50% of what a 1070 costs.
that's what i did: i sold all my 750ti's (i had like 30) and bought 1070's; my power bill decreased by like 30%

750ti's were fantastic cards, the best ones i ever used, but their time for mining is over IMHO.

The number checks out.

On one hand, there's the difference in fabrication process, which says a lot:
GTX 750/750Ti/760/770/780/780Ti/Titan/Titan Z - 28 nm
GTX 950/960/970/980/980 Ti/Titan X - 28 nm
GTX 1060/1070/1080/Titan X(P) - 16 nm
GTX 1050/1050 Ti - 14 nm

But every card has its advantages and disadvantages (e.g. different efficiency) on a per-algo basis.
Considering how old the 750 Ti's are, I'm still making ~70% of what I make with 1070's watt for watt.

But it's not as easy to sell them.

I didn't know the 1050 Ti was 14 nm. That's great. Looking forward to the higher-end cards.
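The watt-for-watt comparison is just income divided by power draw; a minimal sketch in Python (the income and wattage figures below are illustrative placeholders, not measured numbers):

Code:
# Watt-for-watt comparison of two cards on the same algo.
# Income and wattage values are illustrative placeholders only.

def earnings_per_watt(daily_income: float, watts: float) -> float:
    """Daily income divided by power draw."""
    return daily_income / watts

old_750ti = earnings_per_watt(daily_income=0.45, watts=45.0)   # assumption
new_1070  = earnings_per_watt(daily_income=2.00, watts=140.0)  # assumption

print(f"750 Ti earns {old_750ti / new_1070:.0%} of a 1070, watt for watt")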
legendary
Activity: 1764
Merit: 1024
I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except in Ethereum.


Probably, but it's hard to gauge just by how much. I think the number is minuscule, and it's nice not to get locked into 3-4GB in the future, as I'm not planning on selling 1xxx series cards for well over a year.

I could only find estimates on memory power consumption, which vary wildly:

http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (1/3 of the page)

Based on other sources, older cards consumed 4.35 W per gigabyte of GDDR5 (17.4 W for 4 GB and 34.8 W for 8 GB), but the same source says it's closer to 20 W or slightly more for 8 GB.

There are also much higher numbers, like 50 W for an R9 290X.



isn't it watts per die instead of e.g. per GB?

I don't know. Maybe that's one of the reasons why the numbers are so different.

It would be per die... The amount of power the memory consumes is pretty small compared to the card as a whole. Some people also forget that Ethereum is not the only coin in existence, and building your ecosystem around one coin that will be getting hit hard in the coming months is not such a great idea.

NVIDIA is also not responsible for the kind of memory manufacturers use; that is up to manufacturers such as MSI/Gigabyte/Asus.
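To put the per-gigabyte estimate above into numbers, a quick sketch (assuming the disputed ~4.35 W/GB GDDR5 figure from the quoted source; as noted, real draw likely scales with die count, clocks and voltage rather than capacity alone):

Code:
# Back-of-the-envelope GDDR5 memory power estimate.
# Uses the ~4.35 W per GB figure quoted above; actual draw depends
# on the number of dies, clocks and voltage, not capacity alone.

WATTS_PER_GB = 4.35  # disputed estimate, see the AnandTech link

for capacity_gb in (3, 4, 6, 8):
    print(f"{capacity_gb} GB -> ~{capacity_gb * WATTS_PER_GB:.1f} W")

# 4 GB -> ~17.4 W and 8 GB -> ~34.8 W, matching the numbers above.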
legendary
Activity: 2002
Merit: 1051
ICO? Not even once.
I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except in Ethereum.


Probably, but it's hard to gauge just by how much. I think the number is minuscule, and it's nice not to get locked into 3-4GB in the future, as I'm not planning on selling 1xxx series cards for well over a year.

I could only find estimates on memory power consumption, which vary wildly:

http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (1/3 of the page)

Based on other sources, older cards consumed 4.35 W per gigabyte of GDDR5 (17.4 W for 4 GB and 34.8 W for 8 GB), but the same source says it's closer to 20 W or slightly more for 8 GB.

There are also much higher numbers, like 50 W for an R9 290X.



isn't it watts per die instead of e.g. per GB?

I don't know. Maybe that's one of the reasons why the numbers are so different.
sr. member
Activity: 506
Merit: 252
I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except in Ethereum.


Probably, but it's hard to gauge just by how much. I think the number is minuscule, and it's nice not to get locked into 3-4GB in the future, as I'm not planning on selling 1xxx series cards for well over a year.

I could only find estimates on memory power consumption, which vary wildly:

http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (1/3 of the page)

Based on other sources, older cards consumed 4.35 W per gigabyte of GDDR5 (17.4 W for 4 GB and 34.8 W for 8 GB), but the same source says it's closer to 20 W or slightly more for 8 GB.

There are also much higher numbers, like 50 W for an R9 290X.



isn't it watts per die instead of e.g. per GB?
legendary
Activity: 2002
Merit: 1051
ICO? Not even once.
I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except in Ethereum.


Probably, but it's hard to gauge just by how much. I think the number is minuscule, and it's nice not to get locked into 3-4GB in the future, as I'm not planning on selling 1xxx series cards for well over a year.

I could only find estimates on memory power consumption, which vary wildly:

http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (1/3 of the page)

Based on other sources, older cards consumed 4.35 W per gigabyte of GDDR5 (17.4 W for 4 GB and 34.8 W for 8 GB), but the same source says it's closer to 20 W or slightly more for 8 GB.

There are also much higher numbers, like 50 W for an R9 290X.

newbie
Activity: 11
Merit: 0
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
You mean -400 or +400 mem, Amph?

+400, but on a 1070; i was not talking about the 750

i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate

isn't +400 mem high enough?

What is the limit for the Gigabyte NVIDIA GTX 1070?

up to +600 it's ok, but above 600 not so much for certain 1070 models; there is a risk of artifacts

Not correct. Even +400 can lead to severe artifacts.

Better not to push beyond +300 for the GDDR5X of the Pascal architecture, regardless of the RAM producer and its ns rating.

i'm fine at +400, no artifacts here, but i agree that overclocking mem is risky; i don't like it for the long term

Mem O/C is affected by memory type. NVIDIA quietly switched many newer 1070 cards to Micron 8GB GDDR5, which had BIOS issues and some of which can't do +400. Samsung memory will usually hit +600 with no problems. I think all RX 480 8GB cards use the same Samsung memory. IMO overclocking should be fine if the memory is cooled properly, which is sometimes not the case, like with the recent EVGA fiasco; the rated temperature of GDDR5 is around 85°C.
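For reference, these offsets can also be applied from a script on Linux. A minimal sketch using nvidia-settings and nvidia-smi via subprocess; it assumes Coolbits is enabled and that performance level [3] is the right one for your Pascal card, and note that nvidia-smi takes the power limit in watts rather than the % TDP shown in Afterburner:

Code:
# Minimal sketch: apply core/memory offsets and a power limit on Linux.
# Assumes the NVIDIA proprietary driver with Coolbits enabled; performance
# level [3] is typical for Pascal but may differ per card. nvidia-smi -pl
# usually needs root. Verify values for your own hardware before use.
import subprocess

GPU = 0
CORE_OFFSET = 100   # MHz, like the "+100 core" above
MEM_OFFSET  = 400   # note: on Linux this offset applies to the transfer
                    # rate, often ~2x the number shown in Windows tools
POWER_WATTS = 90    # watts, not the % TDP used in the posts above

subprocess.run(["nvidia-settings", "-a",
                f"[gpu:{GPU}]/GPUGraphicsClockOffset[3]={CORE_OFFSET}"],
               check=True)
subprocess.run(["nvidia-settings", "-a",
                f"[gpu:{GPU}]/GPUMemoryTransferRateOffset[3]={MEM_OFFSET}"],
               check=True)
subprocess.run(["nvidia-smi", "-i", str(GPU), "-pl", str(POWER_WATTS)],
               check=True)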

Thanks for your answers guys, all your advice is welcome.
try to use all identical cards.
i'll go for a 1070. if you sell your 3 750Ti's, that should pay for about 50% of what a 1070 costs.
that's what i did: i sold all my 750ti's (i had like 30) and bought 1070's; my power bill decreased by like 30%

750ti's were fantastic cards, the best ones i ever used, but their time for mining is over IMHO.

The number checks out.

On one hand, there's the difference in fabrication process, which says a lot:
GTX 750/750Ti/760/770/780/780Ti/Titan/Titan Z - 28 nm
GTX 950/960/970/980/980 Ti/Titan X - 28 nm
GTX 1060/1070/1080/Titan X(P) - 16 nm
GTX 1050/1050 Ti - 14 nm

But every card has its advantages and disadvantages (e.g. different efficiency) on a per-algo basis.
Considering how old the 750 Ti's are, I'm still making ~70% of what I make with 1070's watt for watt.

But it's not as easy to sell them.


I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except in Ethereum.
legendary
Activity: 1764
Merit: 1024
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
You mean -400 or +400 mem, Amph?

+400, but on a 1070; i was not talking about the 750

i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate

isn't +400 mem high enough?

What is the limit for the Gigabyte NVIDIA GTX 1070?

up to +600 it's ok, but above 600 not so much for certain 1070 models; there is a risk of artifacts

Not correct. Even +400 can lead to severe artifacts.

Better not to push beyond +300 for the GDDR5X of the Pascal architecture, regardless of the RAM producer and its ns rating.

GDDR5X is 1080 only. All the other Pascal models use GDDR5, and their memory OCs and operates in the same way. It's only the 1080 with the weird-ass memory.

If you're getting artifacts, your system isn't stable, even if it's just short-term for mining. +600 works best for me; I can only get +800 stable on some cards. Artifacting isn't any worse for a card than normal OCing; you're just seeing the 'errors' on the card, which you normally can't. All OCs eventually run into errors; it's about minimizing how often they occur.
legendary
Activity: 2002
Merit: 1051
ICO? Not even once.
Thanks for your answers guys, all your advice is welcome.
try to use all identical cards.
i'll go for a 1070. if you sell your 3 750Ti's, that should pay for about 50% of what a 1070 costs.
that's what i did: i sold all my 750ti's (i had like 30) and bought 1070's; my power bill decreased by like 30%

750ti's were fantastic cards, the best ones i ever used, but their time for mining is over IMHO.

The number checks out.

On one hand, there's the difference in fabrication process, which says a lot:
GTX 750/750Ti/760/770/780/780Ti/Titan/Titan Z - 28 nm
GTX 950/960/970/980/980 Ti/Titan X - 28 nm
GTX 1060/1070/1080/Titan X(P) - 16 nm
GTX 1050/1050 Ti - 14 nm

But every card has its advantages and disadvantages (e.g. different efficiency) on a per-algo basis.
Considering how old the 750 Ti's are, I'm still making ~70% of what I make with 1070's watt for watt.

But it's not as easy to sell them.
legendary
Activity: 3248
Merit: 1072
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
You mean -400 or +400 mem, Amph?

+400, but on a 1070; i was not talking about the 750

i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate

isn't +400 mem high enough?

What is the limit for the Gigabyte NVIDIA GTX 1070?

up to +600 it's ok, but above 600 not so much for certain 1070 models; there is a risk of artifacts

Not correct. Even +400 can lead to severe artifacts.

Better not to push beyond +300 for the GDDR5X of the Pascal architecture, regardless of the RAM producer and its ns rating.

i'm fine at +400, no artifacts here, but i agree that overclocking mem is risky; i don't like it for the long term
sr. member
Activity: 445
Merit: 255
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
You mean -400 or +400 mem, Amph?

+400, but on a 1070; i was not talking about the 750

i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate

isn't +400 mem high enough?

What is the limit for the Gigabyte NVIDIA GTX 1070?

up to +600 it's ok, but above 600 not so much for certain 1070 models; there is a risk of artifacts

Not correct. Even +400 can lead to severe artifacts.

Better not to push beyond +300 for the GDDR5X of the Pascal architecture, regardless of the RAM producer and its ns rating.
legendary
Activity: 3164
Merit: 1003
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
You mean -400 or +400 mem, Amph?

+400, but on a 1070; i was not talking about the 750

i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate

isn't +400 mem high enough?

What is the limit for the Gigabyte NVIDIA GTX 1070?

up to +600 it's ok, but above 600 not so much for certain 1070 models; there is a risk of artifacts
Good... I'll have to remember that as soon as my 1070 arrives. Thx
legendary
Activity: 1134
Merit: 1001
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
You mean -400 or +400 mem, Amph?

+400, but on a 1070; i was not talking about the 750

i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate

isn't +400 mem high enough?

What is the limit for the Gigabyte NVIDIA GTX 1070?

up to +600 it's ok, but above 600 not so much for certain 1070 models; there is a risk of artifacts

Understood! Thanks, I have increased speed with NiceHash; set +400 mem now, getting 210 I/s and 405 sol with 2x GTX 1070 at PL 50%.
legendary
Activity: 3248
Merit: 1072
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
You mean -400 or +400 mem, Amph?

+400, but on a 1070; i was not talking about the 750

i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate

isn't +400 mem high enough?

What is the limit for the Gigabyte NVIDIA GTX 1070?

up to +600 it's ok, but above 600 not so much for certain 1070 models; there is a risk of artifacts
legendary
Activity: 3164
Merit: 1003
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
You mean -400 or +400 mem, Amph?
legendary
Activity: 1134
Merit: 1001
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate

isn't +400 mem high enough?

What is the limit for the Gigabyte NVIDIA GTX 1070?
legendary
Activity: 3164
Merit: 1003
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
Yes, overclock the core +100 or more. I have all mine overclocked to match 1345 MHz, the 750ti's sweet spot.

legendary
Activity: 3248
Merit: 1072
i found that +100 core, 55% TDP and +400 mem is better than -502 mem and 55% TDP with 0 core: same wattage but more hashrate
legendary
Activity: 3164
Merit: 1003
I sold my 30 x 750Ti's before the zcash bubble, otherwise I would probably still be using them. But if we talk about efficiency, there is nothing more efficient than the GTX 1070; maybe, only maybe, the GTX 1060, but nothing else comes close. 750Ti's are too old now.

i didn't do the math lately, but nicehash was paying garbage for zcash, even with the efficiency of their private miner; dunno how they are doing right now.

it is so weird to me that most people don't care about power consumption, when it is the single most important factor in mining.
weird...

i have an AC power meter on every single rig, and one big one with data output that i access remotely (it has ethernet) to read how my rigs are doing in every location.

it is the only way i can keep the power bill at levels like today (only 8% of my production; with AMD it was like 40% of the production!)

Did you underclock the mem to -502, OC the core on the 750ti's, set the TDP to 77, and look at the watt meter?
Yes, the 1070's are better.
I just bought one to begin switching over to the newer, higher-efficiency cards.
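A networked power meter like the one described can be polled from a script. A hypothetical sketch: the meter's IP, endpoint and JSON field name below are invented for illustration; a real meter exposes its own interface, so check its manual:

Code:
# Hypothetical sketch: poll an Ethernet power meter and compute the
# electricity cost as a share of mining income. The URL and JSON field
# name are invented for illustration; real meters have their own APIs.
import json
from urllib.request import urlopen

METER_URL = "http://192.168.1.50/status.json"  # hypothetical endpoint
PRICE_PER_KWH = 0.10   # your electricity price
DAILY_INCOME = 12.00   # your rigs' daily production, same currency

with urlopen(METER_URL, timeout=5) as resp:
    watts = json.load(resp)["power_w"]  # hypothetical field name

daily_cost = watts / 1000 * 24 * PRICE_PER_KWH
print(f"Power: {watts:.0f} W, cost {daily_cost:.2f}/day "
      f"({daily_cost / DAILY_INCOME:.0%} of production)")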
legendary
Activity: 1176
Merit: 1015

it is so weird to me that most people don't care about power consumption, when it is the single most important factor in mining.
weird...

levels like today (only 8% of my production; with AMD it was like 40% of the production!)


No it is not; the most important factor is profit. If 60% of AMD production gives you more profit than 92% of NVIDIA production, there is no point in saving power. Yes, I know what you mean and I agree with you, but at current profitability levels and average electricity prices there is no point in going into ultimate power-saving mode.

There is a new 'generation' of miners; they need some time to learn the basic things, then we can read more about power saving.

Where I live the street price for a 750ti is still 70-80€, which is like 250 times the daily profit of a 750ti. I would sell them fast: instant cash and free PCIe slots. Sold mine one year ago already.

edit: 750ti owners, do those whattomine defaults for the 750ti look right to you?
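The "price is ~250x daily profit" claim is easy to sanity-check once electricity is included; a quick sketch (the gross earnings, wattage and electricity price below are illustrative assumptions, not market data):

Code:
# Rough check of "street price is ~250x daily profit" for a 750 Ti.
# Gross earnings, wattage and electricity price are assumptions.

street_price_eur = 75.0    # midpoint of the quoted 70-80 EUR range
gross_eur_per_day = 0.45   # hypothetical gross earnings per card
card_watts = 45.0          # assumed draw for a tuned 750 Ti
price_per_kwh = 0.15       # assumed electricity price

power_cost = card_watts / 1000 * 24 * price_per_kwh  # ~0.16 EUR/day
net_profit = gross_eur_per_day - power_cost          # ~0.29 EUR/day
print(f"net ~{net_profit:.2f} EUR/day -> selling at {street_price_eur:.0f} EUR "
      f"= {street_price_eur / net_profit:.0f} days of mining")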
hero member
Activity: 710
Merit: 502
I sold my 30 x 750Ti's before the zcash bubble, otherwise I would probably still be using them. But if we talk about efficiency, there is nothing more efficient than the GTX 1070; maybe, only maybe, the GTX 1060, but nothing else comes close. 750Ti's are too old now.

i didn't do the math lately, but nicehash was paying garbage for zcash, even with the efficiency of their private miner; dunno how they are doing right now.

it is so weird to me that most people don't care about power consumption, when it is the single most important factor in mining.
weird...

i have an AC power meter on every single rig, and one big one with data output that i access remotely (it has ethernet) to read how my rigs are doing in every location.

it is the only way i can keep the power bill at levels like today (only 8% of my production; with AMD it was like 40% of the production!)