
Topic: nVidia a mining option with Kepler? - page 2. (Read 16120 times)

member
Activity: 115
Merit: 10
February 18, 2013, 02:26:31 PM
#19
I tested them when I was given access to the nVidia Partner Program.
Alas, their speed did not impress me: 204 MH/s.
full member
Activity: 196
Merit: 100
February 12, 2013, 07:02:05 PM
#18
And once more, don't get me wrong, it's not A BAD CARD. The cards are perfect, way better than Nvidia in the same price range; they simply have much more raw power. They win in every area but gaming.
full member
Activity: 196
Merit: 100
February 12, 2013, 06:59:12 PM
#17
Correction, I just installed WoW again; in 13.1 it's fixed. Still not as fluid as on a GTX 660 Ti, but it works. It took them from 12.6 to 13.1 to fix it. And to think I had doubts about them Cheesy only 6 months. CF still doesn't work well, but let's say that's not that important. I'll leave the ATIs to mine bitcoin, but they're not going in my computer ever again for gaming. Imagine spending 600 euro on 2x 7950 six months ago, and I have to play WoW on a single card in DirectX 9... No thanks. Nvidia - the way it's meant to be played.
legendary
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
February 12, 2013, 04:58:02 PM
#16


Quote
that's where these Radeons always fail. Always a patch, always an update...
That's where you fail.

Something against updates? Lol
That's where they fail. The end user wants to put the card into a computer and not worry. That's impossible with ATI; they have good cards but shit drivers, and that's why most gamers go for Nvidia. Not to mention constant lagging in CrossFire...
Umm you can totally get an AMD GPU and set-it-and-forget-it. My dad's media PC has an A8 APU, and he never bothers with driver updates or anything like that.

OTOH, I update my AMD GPU drivers whenever they come out, which is about every month or two. And you know what? Nvidia is constantly releasing drivers at a similar pace, so idk what you're bitching about.

You wanna complain about micro stutter in CF? Well things have gotten a lot better since they started using GCN architecture in the 7xxx cards. No multi-GPU setup is perfect (including Nvidia), but it's not bad enough to bitch about.
Nvidia releases drivers too, but it's mostly cosmetic things. AMD (ATI) is not able to fix major issues with games. As far as I know even WoW is still not working perfectly, and it's a 10-million-player game. It took them two months to make it even work correctly in DirectX 11. There is much more of this crap from AMD... Remember, I am only talking about gaming. For other purposes it works, but since the majority of 250-euro-plus buyers are gamers, it's a fail. That's why, in that department, Nvidia had 80% of the sales in 2012.
Weird, I play everything with ATI and it works perfectly. And the latest drivers improved the performance of pretty much everything by a nice amount. On POEM@home, ATI cards kicked Nvidia off the top.

I don't have all these problems you "report".
And no, Nvidia drivers are NOT cosmetic things.
legendary
Activity: 952
Merit: 1000
February 12, 2013, 01:34:29 PM
#15
Nvidia releases drivers too, but it's mostly cosmetic things. AMD (ATI) is not able to fix major issues with games. As far as I know even WoW is still not working perfectly, and it's a 10-million-player game. It took them two months to make it even work correctly in DirectX 11. There is much more of this crap from AMD... Remember, I am only talking about gaming. For other purposes it works, but since the majority of 250-euro-plus buyers are gamers, it's a fail. That's why, in that department, Nvidia had 80% of the sales in 2012.

BS. I want you to look at the full release notes for the newest 310.90 drivers, pages 10-19. Nvidia drivers are all cosmetic tweaks and performance boosts? No, that's just what the download page highlights, because that's what they want you to think.

Now look at the AMD 13.1 release notes. You see the same thing: performance boosts in games are advertised, and then bug fixes. The only difference is that Nvidia includes game profiles in the driver, while AMD offers those as a separate download (CAPs). Compatibility? My friend has a 7770, and he plays all kinds of games. I seriously think he owns every game Steam has ever offered. He's never run into a compatibility issue because of using an AMD GPU.

And lastly, do you mind posting some numbers to prove your claim that Nvidia outsells AMD 4-to-1 in the 250-euro-and-up market? Because Techspot claims that AMD GPUs are better in both the $300-400 USD and the $400-and-up USD categories.

Classic fanboy slapfest. I love it.  Cool
full member
Activity: 196
Merit: 100
February 12, 2013, 11:31:28 AM
#14


Quote
that's where these Radeons always fail. Always a patch, always an update...
That's where you fail.

Something against updates? Lol
That's where they fail. The end user wants to put the card into a computer and not worry. That's impossible with ATI; they have good cards but shit drivers, and that's why most gamers go for Nvidia. Not to mention constant lagging in CrossFire...
Umm you can totally get an AMD GPU and set-it-and-forget-it. My dad's media PC has an A8 APU, and he never bothers with driver updates or anything like that.

OTOH, I update my AMD GPU drivers whenever they come out, which is about every month or two. And you know what? Nvidia is constantly releasing drivers at a similar pace, so idk what you're bitching about.

You wanna complain about micro stutter in CF? Well things have gotten a lot better since they started using GCN architecture in the 7xxx cards. No multi-GPU setup is perfect (including Nvidia), but it's not bad enough to bitch about.

Nvidia releases drivers too, but it's mostly cosmetic things. AMD (ATI) is not able to fix major issues with games. As far as I know even WoW is still not working perfectly, and it's a 10-million-player game. It took them two months to make it even work correctly in DirectX 11. There is much more of this crap from AMD... Remember, I am only talking about gaming. For other purposes it works, but since the majority of 250-euro-plus buyers are gamers, it's a fail. That's why, in that department, Nvidia had 80% of the sales in 2012.
hero member
Activity: 507
Merit: 500
February 11, 2013, 10:56:06 PM
#13
Has anyone done any research on the new SHFL commands being added to Kepler?

The new _SHFL commands seem (to me) to do the same thing as BIT_ALIGN_INT (32-bit shifting) without having to hit shared memory.

With the upcoming GeForce Titan card, at a price point of $899, supposedly using the same Kepler-based GK110 that powers the Tesla K20X, does anyone think that nVidia might be a viable option for mining sooner rather than later?

http://developer.download.nvidia.com/GTC/PDF/GTC2012/PresentationPDF/S0642-GTC2012-Inside-Kepler.pdf

Yes... it should hash above 1 GH/s (in theory)... but will you drop the money on it?
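
For anyone curious what the single-instruction-shift angle could look like in practice, here is a minimal sketch. The thread never pins down exactly which intrinsic is meant, so this assumes the relevant Kepler operation is the funnel shift that CUDA exposes as __funnelshift_r on sm_35-class hardware; the rotr32 helpers, the sigma0 demo kernel, and the values used are purely illustrative, not code from this thread.

Code:
// Hypothetical sketch: SHA-256 leans heavily on 32-bit rotates. AMD's
// BIT_ALIGN_INT does a rotate in one instruction; pre-Kepler NVIDIA hardware
// needs two shifts plus an OR. If Kepler's new shift hardware is reached
// through CUDA's funnel-shift intrinsic, the rotate is a single instruction.
#include <cstdio>

__device__ __forceinline__ unsigned int rotr32_classic(unsigned int x, unsigned int n)
{
    // Fermi-style rotate: shift right, shift left, OR (three operations).
    return (x >> n) | (x << (32u - n));
}

__device__ __forceinline__ unsigned int rotr32_funnel(unsigned int x, unsigned int n)
{
    // __funnelshift_r(lo, hi, n) returns the low 32 bits of (hi:lo) >> n,
    // so passing x as both halves yields a right rotate in one instruction.
    return __funnelshift_r(x, x, n);
}

// Example consumer: the sigma0 function from the SHA-256 message schedule,
// computed both ways so the results can be compared.
__global__ void sigma0_demo(unsigned int x, unsigned int *out)
{
    out[0] = rotr32_classic(x, 7) ^ rotr32_classic(x, 18) ^ (x >> 3);
    out[1] = rotr32_funnel(x, 7)  ^ rotr32_funnel(x, 18)  ^ (x >> 3);
}

int main()
{
    unsigned int *out;
    cudaMallocManaged(&out, 2 * sizeof(unsigned int));
    sigma0_demo<<<1, 1>>>(0xdeadbeefu, out);
    cudaDeviceSynchronize();
    printf("classic: %08x  funnel: %08x\n", out[0], out[1]);  // both should match
    cudaFree(out);
    return 0;
}

Compiled with nvcc -arch=sm_35, the point is simply that the inner SHA-256 loop spends fewer instructions per rotate, which is where AMD's advantage over Fermi came from in the first place.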
legendary
Activity: 2058
Merit: 1431
February 11, 2013, 10:54:59 PM
#12


Quote
that's where these Radeons always fail. Always a patch, always an update...
That's where you fail.

Something against updates? Lol

That's where they fail. The end user wants to put the card into a computer and not worry. That's impossible with ATI; they have good cards but shit drivers, and that's why most gamers go for Nvidia. Not to mention constant lagging in CrossFire...
hurr durr amd driver suck and make you bluescreen nivida best driver gives you magic unicorns

No, seriously, it's AMD now, not ATI, and their drivers are pretty good.
legendary
Activity: 952
Merit: 1000
February 11, 2013, 10:21:21 PM
#11


Quote
that's where these Radeons always fail. Always a patch, always an update...
That's where you fail.

Something against updates? Lol
That's where they fail. The end user wants to put the card into a computer and not worry. That's impossible with ATI; they have good cards but shit drivers, and that's why most gamers go for Nvidia. Not to mention constant lagging in CrossFire...
Umm you can totally get an AMD GPU and set-it-and-forget-it. My dad's media PC has an A8 APU, and he never bothers with driver updates or anything like that.

OTOH, I update my AMD GPU drivers whenever they come out, which is about every month or two. And you know what? Nvidia is constantly releasing drivers at a similar pace, so idk what you're bitching about.

You wanna complain about micro stutter in CF? Well things have gotten a lot better since they started using GCN architecture in the 7xxx cards. No multi-GPU setup is perfect (including Nvidia), but it's not bad enough to bitch about.
full member
Activity: 196
Merit: 100
February 11, 2013, 07:31:22 PM
#10


Quote
that's where these Radeons always fail. Always a patch, always an update...
That's where you fail.

Something against updates? Lol

That's where they fail. The end user wants to put the card into a computer and not worry. That's impossible with ATI; they have good cards but shit drivers, and that's why most gamers go for Nvidia. Not to mention constant lagging in CrossFire...
legendary
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
February 11, 2013, 06:26:03 PM
#9
Quote
that's where these Radeons always fail. Always a patch, always an update...
That's where you fail.

Something against updates? Lol
full member
Activity: 196
Merit: 100
February 11, 2013, 01:56:49 PM
#8
I don't know much (at all) about OpenCL kernel performance, but one of the reasons why ATI > Nvidia is the number of cores each GPU has. An ATI 7970 has 2048 cores, and an Nvidia 580 has a quarter of that. Even if it is equivalent to BIT_ALIGN_INT, you'd still only be getting 1/4 of the performance.

Kepler will have 2048 cores in the upcoming GeForce Titan

Don't think it will be any good for Bitcoin, but for gaming? Hell yeah; that's where these Radeons always fail. Always a patch, always an update...
legendary
Activity: 1484
Merit: 1005
February 10, 2013, 08:06:23 PM
#7
The outright terrible integer compute performance of the GK104 cores, which were developed side by side with GK110:

http://www.brightsideofnews.com/news/2012/3/22/nvidia-gtx-680-reviewed-a-new-hope.aspx?pageid=4

GK110 is targeted at the rendering and scientific community, which is FP and double-precision FP compute heavy. The main point of GK104 was to provide a card with intensely fast single-precision FP performance to stomp AMD in games (AMD's is a much more well-rounded GPU); GK110 now provides the double-precision FP performance that was totally absent from GK104.
hero member
Activity: 914
Merit: 500
February 10, 2013, 05:26:00 PM
#6
It will still have horrendous integer performance, so no

What do you base this on? It looks like they've been optimizing 32-bit operations with Kepler.
legendary
Activity: 1484
Merit: 1005
February 10, 2013, 05:07:06 PM
#5
It will still have horrendous integer performance, so no
hero member
Activity: 914
Merit: 500
February 10, 2013, 11:09:02 AM
#4
I don't know much (at all) about OpenCL kernel performance, but one of the reasons why ATI > Nvidia is the number of cores each GPU has. An ATI 7970 has 2048 cores, and an Nvidia 580 has a quarter of that. Even if it is equivalent to BIT_ALIGN_INT, you'd still only be getting 1/4 of the performance.

Kepler will have 2048 cores in the upcoming GeForce Titan
legendary
Activity: 952
Merit: 1000
February 10, 2013, 02:46:07 AM
#3
I don't know much (at all) about OpenCL kernel performance, but one of the reasons why ATI > Nvidia is the number of cores each GPU has. An ATI 7970 has 2048 cores, and an Nvidia 580 has a quarter of that. Even if it is equivalent to BIT_ALIGN_INT, you'd still only be getting 1/4 of the performance.
legendary
Activity: 1904
Merit: 1002
February 10, 2013, 12:59:37 AM
#2
Has anyone done any research on the new SHFL commands being added to Kepler?

The new _SHFL commands seem (to me) to do the same thing as BIT_ALIGN_INT (32-bit shifting) without having to hit shared memory.

With the upcoming GeForce Titan card, at a price point of $899, supposedly using the same Kepler-based GK110 that powers the Tesla K20X, does anyone think that nVidia might be a viable option for mining sooner rather than later?

http://developer.download.nvidia.com/GTC/PDF/GTC2012/PresentationPDF/S0642-GTC2012-Inside-Kepler.pdf

Before long, no GPU will be viable unless it includes SHA-256-specific cores.
hero member
Activity: 914
Merit: 500
February 10, 2013, 12:38:37 AM
#1
Has anyone done any research on the new SHFL commands being added to Kepler?

The new _SHFL commands seem (to me) to do the same thing as BIT_ALIGN_INT (32-bit shifting) without having to hit shared memory.

With the upcoming GeForce Titan card, at a price point of $899, supposedly using the same Kepler-based GK110 that powers the Tesla K20X, does anyone think that nVidia might be a viable option for mining sooner rather than later?

http://developer.download.nvidia.com/GTC/PDF/GTC2012/PresentationPDF/S0642-GTC2012-Inside-Kepler.pdf
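
On the "without having to hit shared memory" part: what Kepler's warp shuffle buys you is register-to-register data exchange within a warp. A minimal sketch under that assumption, written with the modern __shfl_down_sync spelling (2013-era CUDA used __shfl_down without the mask argument); the warp_sum helper and demo kernel are illustrative only, not code from this thread.

Code:
// Hypothetical illustration: a warp-wide sum that never touches shared
// memory, the kind of exchange the SHFL instruction enables. On Fermi the
// same reduction needs a shared-memory buffer plus synchronization.
#include <cstdio>

__device__ __forceinline__ unsigned int warp_sum(unsigned int v)
{
    // Each step pulls the value held by the lane 'offset' positions above;
    // after log2(32) = 5 steps, lane 0 holds the total for the whole warp.
    for (int offset = 16; offset > 0; offset >>= 1)
        v += __shfl_down_sync(0xffffffffu, v, offset);
    return v;
}

__global__ void warp_sum_demo(unsigned int *out)
{
    unsigned int total = warp_sum(threadIdx.x);  // sum of 0..31 = 496
    if (threadIdx.x == 0)
        *out = total;
}

int main()
{
    unsigned int *out;
    cudaMallocManaged(&out, sizeof(unsigned int));
    warp_sum_demo<<<1, 32>>>(out);  // launch exactly one full warp
    cudaDeviceSynchronize();
    printf("warp total: %u\n", *out);  // expect 496
    cudaFree(out);
    return 0;
}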