
Topic: CCminer(SP-MOD) Modded NVIDIA Maxwell / Pascal kernels. - page 484. (Read 2347659 times)

sp_
legendary
Activity: 2954
Merit: 1087
Team Black developer
yiimp is paying 0.00935 BTC per day for 1000 MH/s (based on the payouts over the last 24 hours)




0.00935 BTC per GH/s per day × 0.3 GH/s × $753/BTC ≈ $2.11 per day

The LBRY sp-mod works best on the Maxwell cards: 300 MH/s on the 980 Ti, while the open-source build does around 250 MH/s. The LBRY sp-mod also works well on the GTX 1060 3GB and 6GB.
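For anyone who wants to plug in their own numbers, here is a minimal Python sketch of the arithmetic above (the payout rate and BTC price are the figures quoted in this post and drift constantly; the function name is just illustrative):

Code:
# Rough daily-revenue estimate for LBRY on yiimp.
# Payout rate and BTC price are the figures quoted above; both change constantly.
def lbry_daily_usd(hashrate_mhs, payout_btc_per_ghs=0.00935, btc_usd=753.0):
    """Estimate USD per day for a given hashrate in MH/s."""
    hashrate_ghs = hashrate_mhs / 1000.0  # yiimp quotes BTC/day per 1000 MH/s
    return hashrate_ghs * payout_btc_per_ghs * btc_usd

print(round(lbry_daily_usd(300), 2))  # 980 Ti with sp-mod: ~2.11
print(round(lbry_daily_usd(250), 2))  # open-source ccminer: ~1.76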
sr. member
Activity: 445
Merit: 255
Ethereum is earning on par with Equihash right now, especially considering Nvidia miners are locked to Nicehash's miner. Apparently Ethereum armageddon was moved back to next year.

It would be nice if there was a 'next big thing'. I haven't heard about anything coming out since Zcash.

LBRY is profitable again with the sp-mod private kernel.

$2.25 per day if you mine at 300 MH/s

You're overestimating the amounts, and it's a pretty constant pattern with your statements, sp ;)

$1.60 on average. Zcash is better.

PS: with the open-source 1.8.3 in any case, it's a bit better than the private sp1 on 1070s (given the same performance but without the fees and reconnections).
sp_
legendary
Activity: 2954
Merit: 1087
Team Black developer
Ethereum is earning on par with Equihash right now, especially considering Nvidia miners are locked to Nicehash's miner. Apparently Ethereum armageddon was moved back to next year.

It would be nice if there was a 'next big thing'. I haven't heard about anything coming out since Zcash.

LBRY is profitable again with the sp-mod private kernel.

$2.25 per day if you mine at 300 MH/s
legendary
Activity: 1108
Merit: 1005
Quote
Ethereum originally planned to go PoS by summer 2016, then fall, now early 2017. Don't hold your breath - IMO the devs and those close to the project are mining it, and advertising a PoS switch "soon" is an excellent deterrent to keep others from building farms.

Is it really early 2017?
Because I found this: http://cryptomine.weebly.com/cryptomine-blog/ethereum-to-switch-to-pos-before-november-1-2017
legendary
Activity: 1764
Merit: 1024
Ethereum continues to give the largest % of mining revenue by far

Only if you're a plug-and-play miner, and those will all have a very bad time in the next few months that will shake them out of mining - even with very cheap electricity - until the next big thing comes out for a few weeks, maybe months.

Personally, I don't even remember the last time I mined Ethereum.

Ethereum is earning on par with Equihash right now, especially considering Nvidia miners are locked to Nicehash's miner. Apparently Ethereum armageddon was moved back to next year.

It would be nice if there was a 'next big thing'. I haven't heard about anything coming out since Zcash.
legendary
Activity: 2002
Merit: 1051
ICO? Not even once.
Ethereum continues to give the largest % of mining revenue by far

Only if you're a plug-and-play miner, and those will all have a very bad time in the next few months that will shake them out of mining - even with very cheap electricity - until the next big thing comes out for a few weeks, maybe months.

Personally, I don't even remember the last time I mined Ethereum.
legendary
Activity: 1764
Merit: 1024
I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except in Ethereum.


Probably, but it's hard to gauge by just how much. I think the number is minuscule, and it's nice not to get locked in with 3-4GB in the future, as I'm not planning on selling 1xxx-series cards for well over a year.

I could only find estimates on memory power consumption which vary wildly:

http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (1/3 of the page)

Based on other sources, older cards consumed 4.35 W per gigabyte of GDDR5 (17.4 W for 4GB and 34.8 W for 8GB), but the same source says it's closer to 20 W or slightly more for 8GB.

There are also much higher numbers, like 50W for an R9 290X.



Isn't it watts per die instead of, e.g., per GB?

I don't know. Maybe that's one of the reasons why the numbers are so different.

It would be per die... The amount of power memory consumes is pretty small compared to the entirety of the card. Some people also forget that Ethereum is not the only coin in existence and building your ecosystem around one coin that will be getting hit hard in the coming months is not such a great idea.

Nvidia is also not responsible for the kind of memory manufacturers use. That would be up to manufacturers such as MSI/Gigabyte/Asus.

Ethereum continues to give the largest % of mining revenue by far; without it we can all throw our cards away. Also, 30W/card would not be negligible.

Only NVIDIA pulls this memory-switching shit every generation; your argument fails because AMD and its AIB partners don't do this.

Ethereum is going away this Januaryish as it shifts to PoS.

Only Nvidia switches memory? lol... When I mined with AMD I had three different memory manufacturers: Hynix, Elpida, and Samsung, from four different card manufacturers, some changing between the same cards and models. They do it because they find a better partner that can source memory cheaper or on better terms. Mining is a small drop in the bucket when it comes to the entirety of GPUs.

As I mentioned, AMD/Nvidia have nothing to do with who the memory is sourced from, only the type that is used (GDDR5/X, HBM). That's all down to the card manufacturers.

Ethereum originally planned to go PoS by summer 2016, then fall, now early 2017. Don't hold your breath - IMO the devs and those close to the project are mining it, and advertising a PoS switch "soon" is an excellent deterrent to keep others from building farms.

It's still in the code, it was just moved back a year apparently. So next fall. I was operating off old information. -_-

This just makes Nvidia even more worthless to mine with now. Everything else is still relevant.
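To put rough numbers on the per-GB estimate from the quoted discussion, here is a quick Python sketch using the disputed 4.35 W/GB GDDR5 figure; as noted in the thread, memory power really scales per die, so treat these as ballpark guesses rather than measurements:

Code:
# Back-of-the-envelope GDDR5 memory power, assuming the (disputed)
# 4.35 W-per-GB estimate from the sources quoted above.
GDDR5_W_PER_GB = 4.35

for card, mem_gb in [("GTX 1060 3GB", 3),
                     ("GTX 1060 6GB", 6),
                     ("GTX 1070 8GB", 8)]:
    print(f"{card}: ~{mem_gb * GDDR5_W_PER_GB:.1f} W for memory")
# The 1070 comes out at ~34.8 W, matching the 17.4 W / 34.8 W numbers above,
# though the same source also suggests closer to 20 W for 8GB.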
legendary
Activity: 2128
Merit: 1005
ASIC Wannabe
I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except in Ethereum.


Probably, but it's hard to gauge by just how much. I think the number is minuscule, and it's nice not to get locked in with 3-4GB in the future, as I'm not planning on selling 1xxx-series cards for well over a year.

I could only find estimates on memory power consumption which vary wildly:

http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (1/3 of the page)

Based on other sources, older cards consumed 4.35 W per gigabyte of GDDR5 (17.4 W for 4GB and 34.8 W for 8GB), but the same source says it's closer to 20 W or slightly more for 8GB.

There are also much higher numbers, like 50W for an R9 290X.



Isn't it watts per die instead of, e.g., per GB?

I don't know. Maybe that's one of the reasons why the numbers are so different.

It would be per die... The amount of power memory consumes is pretty small compared to the entirety of the card. Some people also forget that Ethereum is not the only coin in existence and building your ecosystem around one coin that will be getting hit hard in the coming months is not such a great idea.

Nvidia is also not responsible for the kind of memory manufacturers use. That would be up to manufacturers such as MSI/Gigabyte/Asus.

Ethereum continues to give the largest % of mining revenue by far; without it we can all throw our cards away. Also, 30W/card would not be negligible.

Only NVIDIA pulls this memory-switching shit every generation; your argument fails because AMD and its AIB partners don't do this.

Ethereum is going away this Januaryish as it shifts to PoS.

Only Nvidia switches memory? lol... When I mined with AMD I had three different memory manufacturers: Hynix, Elpida, and Samsung, from four different card manufacturers, some changing between the same cards and models. They do it because they find a better partner that can source memory cheaper or on better terms. Mining is a small drop in the bucket when it comes to the entirety of GPUs.

As I mentioned, AMD/Nvidia have nothing to do with who the memory is sourced from, only the type that is used (GDDR5/X, HBM). That's all down to the card manufacturers.

Ethereum originally planned to go PoS by summer 2016, then fall, now early 2017. Don't hold your breath - IMO the devs and those close to the project are mining it, and advertising a PoS switch "soon" is an excellent deterrent to keep others from building farms.
legendary
Activity: 1764
Merit: 1024
I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except in Ethereum.


Probably, but it's hard to gauge by just how much. I think the number is minuscule, and it's nice not to get locked in with 3-4GB in the future, as I'm not planning on selling 1xxx-series cards for well over a year.

I could only find estimates on memory power consumption which vary wildly:

http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (1/3 of the page)

Based on other sources, older cards consumed 4.35 W per gigabyte of GDDR5 (17.4 W for 4GB and 34.8 W for 8GB), but the same source says it's closer to 20 W or slightly more for 8GB.

There are also much higher numbers, like 50W for an R9 290X.



Isn't it watts per die instead of, e.g., per GB?

I don't know. Maybe that's one of the reasons why the numbers are so different.

It would be per die... The amount of power memory consumes is pretty small compared to the entirety of the card. Some people also forget that Ethereum is not the only coin in existence and building your ecosystem around one coin that will be getting hit hard in the coming months is not such a great idea.

Nvidia is also not responsible for the kind of memory manufacturers use. That would be up to manufacturers such as MSI/Gigabyte/Asus.

Ethereum continues to give the largest % of mining revenue by far; without it we can all throw our cards away. Also, 30W/card would not be negligible.

Only NVIDIA pulls this memory-switching shit every generation; your argument fails because AMD and its AIB partners don't do this.

Ethereum is going away this Januaryish as it shifts to PoS.

Only Nvidia switches memory? lol... When I mined with AMD I had three different memory manufacturers: Hynix, Elpida, and Samsung, from four different card manufacturers, some changing between the same cards and models. They do it because they find a better partner that can source memory cheaper or on better terms. Mining is a small drop in the bucket when it comes to the entirety of GPUs.

As I mentioned, AMD/Nvidia have nothing to do with who the memory is sourced from, only the type that is used (GDDR5/X, HBM). That's all down to the card manufacturers.
legendary
Activity: 2128
Merit: 1005
ASIC Wannabe
ASIC for ZCash, ZCoin, etc.: http://www.ufominers.com/zcash-equinox

Don't link that site - total scam

http://www.ufominers.com/ethereum-rhinominer


Because everyone knows that an Ethereum miner needs a FLOPPY DRIVE.
newbie
Activity: 11
Merit: 0
I think the 1070 loses efficiency powering its 8GB of memory; the 1060 3GB should be best, except in Ethereum.


Probably, but it's hard to gauge by just how much. I think the number is minuscule, and it's nice not to get locked in with 3-4GB in the future, as I'm not planning on selling 1xxx-series cards for well over a year.

I could only find estimates on memory power consumption which vary wildly:

http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory (1/3 of the page)

Based on other sources, older cards consumed 4.35 W per gigabyte of GDDR5 (17.4 W for 4GB and 34.8 W for 8GB), but the same source says it's closer to 20 W or slightly more for 8GB.

There are also much higher numbers, like 50W for an R9 290X.



Isn't it watts per die instead of, e.g., per GB?

I don't know. Maybe that's one of the reasons why the numbers are so different.

It would be per die... The amount of power memory consumes is pretty small compared to the entirety of the card. Some people also forget that Ethereum is not the only coin in existence and building your ecosystem around one coin that will be getting hit hard in the coming months is not such a great idea.

Nvidia is also not responsible for the kind of memory manufacturers use. That would be up to manufacturers such as MSI/Gigabyte/Asus.

Ethereum continues to give the largest % of mining revenue by far; without it we can all throw our cards away. Also, 30W/card would not be negligible.

Only NVIDIA pulls this memory-switching shit every generation; your argument fails because AMD and its AIB partners don't do this.
legendary
Activity: 1797
Merit: 1028
980 W for 900 H/s at $3300?




EIGHT 280X CARDS WILL BEAT THAT--

I have two 4-card 280X rigs that have already paid for themselves at least twice.  They hash at over 700 Sol/s per rig.  The wattage for both rigs together is likely more than 980 W; I haven't measured.

If they were purchased used, with two 4-slot PCIe motherboards and PSUs, the total cost would be less than $1600.

I don't see the advantage.       --scryptr
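A quick sketch of the cost-per-hash comparison behind this post, using the rough figures given above (used-rig pricing varies a lot, so these are illustrative only):

Code:
# Upfront cost per Sol/s: the advertised Equihash ASIC vs. used 280X rigs.
# All figures are the rough ones from the post above.
def usd_per_sol(price_usd, sols_per_s):
    return price_usd / sols_per_s

print(round(usd_per_sol(3300, 900), 2))      # ASIC: ~3.67 USD per Sol/s
print(round(usd_per_sol(1600, 2 * 700), 2))  # two 280X rigs: ~1.14 USD per Sol/s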

legendary
Activity: 1134
Merit: 1001
980 W for 900 H/s at $3300?



legendary
Activity: 3164
Merit: 1003
Another piece of shit. $3.3k for 900 H/s?
Yes, and way overpriced IMO. Probably used too. :D
legendary
Activity: 3164
Merit: 1003
ASIC for ZCash, ZCoin, etc.: http://www.ufominers.com/zcash-equinox
Thanks for posting that. I get the same hash rate and wattage with what I have.
I bet they could make money on that if they sold it at half price. :) I wonder if they're used. ;D
legendary
Activity: 3248
Merit: 1072

Which is a scam anyway, because for around $1.2k (4 x 480s plus all the other components) you can get that speed.
legendary
Activity: 2716
Merit: 1094
Black Belt Developer
full member
Activity: 304
Merit: 100
ASIC for ZCash, ZCoin, etc.: http://www.ufominers.com/zcash-equinox

Who needs that at this price and power consumption?
member
Activity: 84
Merit: 10
Another piece of shit. $3.3k for 900 H/s?
hero member
Activity: 677
Merit: 500