
Topic: Why do people only buy ATI cards when NVIDIA is so much better? - page 4. (Read 11290 times)

legendary
Activity: 1428
Merit: 1001
Okey Dokey Lokey
Hey guys, no troll intended. I just wondered why people use AMD/ATI, and now I know. Wink


Everyone wants the little guy to succeed, including me, but the hard truth is Intel could buy AMD 10 times over if it was allowed to by the government. That's how far behind AMD is now, technology-wise... They are a mess; their new Bulldozer chips bench 100 MHz faster than two-year-old Sandy Bridge chips  Roll Eyes

Bulldozer will find its way in between the i5 2500 and i7 2600 and thus be placed in the upper tier of gaming-oriented CPUs, which is just where AMD aims its products (no, you don't need an i7 hexacore to play games). While it's indeed a bit late, they made a processor on a new architecture that will lead the way for upcoming generations, not to mention that a new architecture takes time to develop and delays are unavoidable. Also, the FX series overclocks like crazy at a pretty low price.

Not sure how that found its way into an NVIDIA vs. AMD thread, but yeah, Intel will probably always be better, though they won't beat AMD at price/performance :3


Y'know... I was about to flame the fuck outta you for saying that the Bulldozers were going to "settle in" between the i5/i7 high-enders.
But then I kept reading and noticed that you know what the fuck you're talking about, lol.
If game/program coding stays the same, then yes, I could totally see the Bulldozer Zambezi chips being a failure. However, there are two words that always seem to go "haha":
Moore's Law.
As it stands... I do not believe that Bulldozer will "settle in" where you estimate it to. I think if they get it right... and they use all 8 cores properly, then there's no way it's merely "as weak or as strong as" the Intel i7 EX.
I'll admit I'm an AMD fanboy (loved the Athlon II x64s vs. the Intel Core 2).

Wait, where am I going with this.....
Oh. If AMD doesn't do some good coding for their CPUs, then yeah, it's gonna perform like an i7.
But considering the 16-core Istanbul... I think they'll get the coding done just fine.... after a year, LOL.
member
Activity: 98
Merit: 10
Hey guys, no troll intended. I just wondered why people use AMD/ATI, and now I know. Wink


Everyone wants the little guy to succeed, including me, but the hard truth is Intel could buy AMD 10 times over if it was allowed to by the government. That's how far behind AMD is now, technology-wise... They are a mess; their new Bulldozer chips bench 100 MHz faster than two-year-old Sandy Bridge chips  Roll Eyes

Bulldozer will find its way in between the i5 2500 and i7 2600 and thus be placed in the upper tier of gaming-oriented CPUs, which is just where AMD aims its products (no, you don't need an i7 hexacore to play games). While it's indeed a bit late, they made a processor on a new architecture that will lead the way for upcoming generations, not to mention that a new architecture takes time to develop and delays are unavoidable. Also, the FX series overclocks like crazy at a pretty low price.

Not sure how that found its way into an NVIDIA vs. AMD thread, but yeah, Intel will probably always be better, though they won't beat AMD at price/performance :3

Well, ATI is owned by AMD. Intel would buy Nvidia, but antitrust lawsuits would come as soon as the merger happened.

Problem is, Intel is going to drop the 2500K to $150/$125 when Sandy Bridge-E and X79 come out, and at that price point it smokes Bulldozer pound for pound. I love AMD, and I wish they were like they were back in the X2 64 days when they pushed Intel hard, but with the way Bulldozer turned out, people are going to be losing their jobs  Embarrassed. It's not as bad as some thought, but really they should have done better.

I would never buy first-release/first-gen technology; you always lose out because it takes them 6-12 months to iron out the bugs and BIOS updates, and it costs you a premium.
legendary
Activity: 1428
Merit: 1001
Okey Dokey Lokey
http://en.wikipedia.org/wiki/IW_engine

Not exactly popular outside the CoD line.  Smiley

I'm going to bet that exploiting something weird in particular game engines, such that there's a real difference between 110 and 333 fps (I take this comparison from the 110/333 mentioned earlier), is a very small niche in the overall gaming graphics market. But hey, if you'll throw the cash at it, more power to ya.

Jack: vsync or not, you're only seeing however many of those frames your monitor is capable of displaying. If 380 fps means you can do funky tricks people at 90 fps can't, then there's something screwy with the engine IMHO (or it's just old enough that, when it was designed, they didn't even bother considering multiple hundreds of fps), but when it comes to what you're actually seeing, you're only seeing what vsync would show you.

Point taken. I was about to start going "Dude, there's such a thing as V-tears" and then went "oh lol, right, they're there because of that".
newbie
Activity: 42
Merit: 0
http://en.wikipedia.org/wiki/IW_engine

Not exactly popular outside the CoD line.  Smiley

I'm going to bet that exploiting something weird in particular game engines, such that there's a real difference between 110 and 333 fps (I take this comparison from the 110/333 mentioned earlier), is a very small niche in the overall gaming graphics market. But hey, if you'll throw the cash at it, more power to ya.

Jack: vsync or not, you're only seeing however many of those frames your monitor is capable of displaying. If 380 fps means you can do funky tricks people at 90 fps can't, then there's something screwy with the engine IMHO (or it's just old enough that, when it was designed, they didn't even bother considering multiple hundreds of fps), but when it comes to what you're actually seeing, you're only seeing what vsync would show you.
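
To put rough numbers on that point (a minimal sketch; the 380 fps and 60 Hz figures are the ones from this thread, the rest is just illustration, not any engine's actual code):
Code:
#include <stdio.h>

/* Minimal sketch: however many frames the card renders, the panel only
   latches one frame per refresh interval, so the rest never reach your eyes. */
int main(void) {
    const double render_fps = 380.0;   /* figure quoted in this thread */
    const double refresh_hz = 60.0;    /* typical LCD refresh rate     */

    double frame_ms   = 1000.0 / render_fps;   /* ~2.6 ms per rendered frame  */
    double refresh_ms = 1000.0 / refresh_hz;   /* ~16.7 ms per screen refresh */

    /* Frames rendered during one refresh; only the newest one gets shown. */
    double rendered_per_refresh = refresh_ms / frame_ms;

    printf("Rendered per refresh: %.1f frames, displayed: 1\n", rendered_per_refresh);
    printf("Frames you actually see per second: %.0f out of %.0f\n",
           refresh_hz, render_fps);
    return 0;
}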
hero member
Activity: 770
Merit: 502
Who said anything about having Vsync actually enabled? Anyone at 61+ fps usually has Vsync off.

Fuck lol, I'm hitting 380 fps average in CS:S maxxxed out.

For all the times I've heard of Counter-Strike, I've actually never played it. Is it really good?
legendary
Activity: 1428
Merit: 1001
Okey Dokey Lokey
Who said anything about having Vsync actually enabled? Anyone at 61+ fps usually has Vsync off.

Fuck lol, I'm hitting 380 fps average in CS:S maxxxed out.
hero member
Activity: 770
Merit: 502
I don't really care about the looks of the game; I want the highest FPS I can get, 333 or more, which takes advantage of the game. My two 5830s cannot even stay above 110 in most of the games I've got, or I can't play them at all.

So, what monitor are you using where fps that high actually matters?

And don't get me started on what the human brain is actually able to process...

I'm using a 17" monitor. This is not about what I can see; I clearly said in my post it's to take advantage of the games I play.

It seems in some games (like Quake 3 and maybe some CoD), if you have like 333 fps you can do more things, jump higher or something like that.....

Gabi has it correct: with 125 or 333 FPS you can jump higher, get onto ledges and boxes, and take advantage of the games.

This isn't console; this is PC gaming at its best.

Imaginary land.

Funny bashing a card for "only" getting 110 fps when virtually all LCDs only display 60 frames per second anyway. Very few are 72 or 75 Hz, and even then 100 fps is more than enough to drive them to their limit. Remember, an LCD is a physical device; it takes some time to physically pivot the liquid crystals (via electrical impulse) and alter how much light is emitted. It doesn't matter how many fps a video card can create in memory, it still takes time for the actual crystals to align.

Well, that will be your little secret.

Yes, but it's not about the monitor, it's about game physics glitches and fps.

Gabi is correct once again. Smiley

If for some reason a game behaves differently between 110 fps and 333 fps, there is some sort of crazy programming going on in the background, unless the difference is being caused by the CPU and not the GPU, or it's PhysX compatibility problems with the ATI/AMD cards.

[edit] One last thing: did CrossFire even -exist- when Q3 was released? That's also the same graphics engine used in CoD1. The reason I ask is that the person I originally replied to was comparing an Nvidia 6600 to CrossFired 5830s in CoD1.

Though a single 5830 should still smoke an Nvidia 6600, unless the Q3 engine was designed with Nvidia specs in mind (which they did for Quake 2 and 3dfx; my Voodoo 3 outperformed much faster GeForces in Quake 2 for years, so I wouldn't be surprised if they did the same for Quake 3 and Nvidia).

The game might be designed like that, but I've always felt ATI/AMD locked something down to keep high FPS from burning up the card. This was also speculated about ATI/AMD with the FurMark benchmark program.

But all in all, these little fps tricks work for CoD1 & United Offensive, CoD2, CoD4, CoD: WaW, BO, and most likely whatever other games use the same engine.

Edit:
On another note:
A GTX 480/490 or GTX 580/580 x2/590 would push CoD games over 1000 FPS, and if PB (PunkBuster) is enabled you will get kicked for it, but if PB is disabled there will be no problem. An AMD 6990 would probably get 250 fps in CoD games.
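
For anyone wondering how a frame rate can change physics at all, here's a minimal sketch of the general idea. It is not the actual id Tech 3 code and the constants are made up; it only shows how snapping a value once per rendered frame makes a fixed-step integrator framerate-dependent, which is the class of quirk these 125/333 fps tricks rely on.
Code:
#include <stdio.h>
#include <math.h>

/* Simulate a jump with gravity applied once per rendered frame, truncating
   the vertical velocity to an integer each frame (loosely inspired by how
   Quake-era engines quantize movement values per frame; not real game code).
   The per-frame truncation accumulates differently at different frame rates,
   so the peak jump height ends up depending on fps. */
static double peak_height(double fps) {
    double dt = 1.0 / fps;
    double vel = 270.0;      /* assumed initial jump velocity, units/s */
    double gravity = 800.0;  /* assumed gravity, units/s^2             */
    double pos = 0.0, peak = 0.0;

    while (vel > 0.0) {
        pos += vel * dt;         /* integrate position                  */
        vel -= gravity * dt;     /* apply gravity for this frame        */
        vel = trunc(vel);        /* snap velocity, once per frame       */
        if (pos > peak) peak = pos;
    }
    return peak;
}

int main(void) {
    double rates[] = { 60.0, 125.0, 250.0, 333.0 };
    for (int i = 0; i < 4; i++)
        printf("%3.0f fps -> peak jump height %.2f units\n",
               rates[i], peak_height(rates[i]));
    return 0;
}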
newbie
Activity: 42
Merit: 0
If for some reason a game behaves differently between 110 fps and 333 fps, there is some sort of crazy programming going on in the background, unless the difference is being caused by the CPU and not the GPU, or it's PhysX compatibility problems with the ATI/AMD cards.

[edit] One last thing: did CrossFire even -exist- when Q3 was released? That's also the same graphics engine used in CoD1. The reason I ask is that the person I originally replied to was comparing an Nvidia 6600 to CrossFired 5830s in CoD1.

Though a single 5830 should still smoke an Nvidia 6600, unless the Q3 engine was designed with Nvidia specs in mind (which they did for Quake 2 and 3dfx; my Voodoo 3 outperformed much faster GeForces in Quake 2 for years, so I wouldn't be surprised if they did the same for Quake 3 and Nvidia).
legendary
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
Yes, but it's not about the monitor, it's about game physics glitches and fps.
donator
Activity: 1218
Merit: 1079
Gerald Davis
Imaginary land.

Funny bashing a card for "only" getting 110 fps when virtually all LCDs only display 60 frames per second anyway. Very few are 72 or 75 Hz, and even then 100 fps is more than enough to drive them to their limit. Remember, an LCD is a physical device; it takes some time to physically pivot the liquid crystals (via electrical impulse) and alter how much light is emitted. It doesn't matter how many fps a video card can create in memory, it still takes time for the actual crystals to align.
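
Rough numbers behind that, as a quick sketch (the ~5 ms response time is an assumed typical figure, not a spec quoted in this thread):
Code:
#include <stdio.h>

/* Back-of-the-envelope: frame interval at various fps vs. a 60 Hz refresh
   and an assumed ~5 ms gray-to-gray pixel response time. */
int main(void) {
    const double refresh_hz  = 60.0;
    const double response_ms = 5.0;    /* assumed, typical TN panel */
    const double fps[] = { 60.0, 110.0, 333.0, 380.0 };

    printf("Refresh interval: %.1f ms, assumed pixel response: %.1f ms\n",
           1000.0 / refresh_hz, response_ms);

    for (int i = 0; i < 4; i++) {
        double frame_ms = 1000.0 / fps[i];
        printf("%4.0f fps -> %4.1f ms per frame (%s the crystal's transition time)\n",
               fps[i], frame_ms,
               frame_ms < response_ms ? "shorter than" : "longer than");
    }
    return 0;
}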
legendary
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
It seems in some games (like Quake 3 and maybe some CoD), if you have like 333 fps you can do more things, jump higher or something like that.....
newbie
Activity: 42
Merit: 0
I don't really care about the looks of the game; I want the highest FPS I can get, 333 or more, which takes advantage of the game. My two 5830s cannot even stay above 110 in most of the games I've got, or I can't play them at all.

So, what monitor are you using where fps that high actually matters?

And don't get me started on what the human brain is actually able to process...
legendary
Activity: 1148
Merit: 1008
If you want to walk on water, get out of the boat
I had an ATI 1900, no problems.

Then I had an ATI 3870, no problems.

Now I have an ATI 6950, and once more no problems at all.

Everything works fine. All games, everything.
hero member
Activity: 770
Merit: 502
Just a noob question here:
Have you gone into the CCC and:
Set everything to application-controlled?
Turned OFF AMD-optimized surface format for textures and tessellation?
Turned on OpenGL triple buffering? (Vsyncers use this.)

I've tried every setting that could possibly be done.

This is where I found the fix for the CoD4 freezing, post #7:

http://forums.steampowered.com/forums/showthread.php?s=62de95079276636814aa65273ec2a093&p=21474117#post21474117

Post #7 was the fix for CoD4; nothing can be done for WaW. There are a ton of threads going around about 5830 CrossFireX setups and games freezing, and AMD, from what I see, hasn't done a thing about it.
http://forums.steampowered.com/forums/showpost.php?s=62de95079276636814aa65273ec2a093&p=21474117&postcount=7
legendary
Activity: 1428
Merit: 1001
Okey Dokey Lokey
Shit man, not a fanboy here, but I have only 4 or 5 PC games, and one of my favorites, CoD: World at War, I cannot even play in CrossFireX. It freezes up on map load: not while loading the map, but when the map has loaded, the game freezes. CoD4 did the same; I found a fix for it, but you've gotta have 16x AA enabled, which is BS. It drops my frame rates down significantly, as if I were using just one 5830, because I've gotta have AA enabled at 16x to play. What's the point of CrossFireX if I've gotta enable this? BTW, this little fix doesn't work for WaW. I don't really care about the looks of the game; I want the highest FPS I can get, 333 or more, which takes advantage of the game. My two 5830s cannot even stay above 110 in most of the games I've got, or I can't play them at all.

I picked ATI a couple of years ago because of price/performance, then a 5830 ended up in my hands, which led to mining, which led to having two 5830s.

I had an Nvidia 6600 a couple of years back, and that 6600 still performed better on one of the oldest games I've got, CoD1, than even two 5830s.

It's a matter of preference in what card you want, or of the expense you're willing to take on to get what you want.

I dunno, there is just something about ATI/AMD cards I don't like using for gaming; somehow frame rates get locked. Nvidia can/could squeeze extra frame rates out of the games I play.

This is just my opinion, so please don't bash me for it Smiley.

I'll stick with ATI/AMD for bitcoin mining; once I'm able to get hold of a high-end Nvidia card, I'm going for it, strictly for PC gaming.

Just a noob question here:
Have you gone into the CCC and:
Set everything to application-controlled?
Turned OFF AMD-optimized surface format for textures and tessellation?
Turned on OpenGL triple buffering? (Vsyncers use this.)
hero member
Activity: 770
Merit: 502
Shit man, not a fanboy here, but I have only 4 or 5 PC games, and one of my favorites, CoD: World at War, I cannot even play in CrossFireX. It freezes up on map load: not while loading the map, but when the map has loaded, the game freezes. CoD4 did the same; I found a fix for it, but you've gotta have 16x AA enabled, which is BS. It drops my frame rates down significantly, as if I were using just one 5830, because I've gotta have AA enabled at 16x to play. What's the point of CrossFireX if I've gotta enable this? BTW, this little fix doesn't work for WaW. I don't really care about the looks of the game; I want the highest FPS I can get, 333 or more, which takes advantage of the game. My two 5830s cannot even stay above 110 in most of the games I've got, or I can't play them at all.

I picked ATI a couple of years ago because of price/performance, then a 5830 ended up in my hands, which led to mining, which led to having two 5830s.

I had an Nvidia 6600 a couple of years back, and that 6600 still performed better on one of the oldest games I've got, CoD1, than even two 5830s.

It's a matter of preference in what card you want, or of the expense you're willing to take on to get what you want.

I dunno, there is just something about ATI/AMD cards I don't like using for gaming; somehow frame rates get locked. Nvidia can/could squeeze extra frame rates out of the games I play.

This is just my opinion, so please don't bash me for it Smiley.

I'll stick with ATI/AMD for bitcoin mining; once I'm able to get hold of a high-end Nvidia card, I'm going for it, strictly for PC gaming.
hero member
Activity: 914
Merit: 500
I don't get this. Is it price? I thought people would be investing in NVIDIA GTX 580s rather than ATI.

What gives?

I smell a troll.  I don't see why anyone would even post something like this when even a few minutes of reading on the boards/research would clearly show ATI outperforms NVidia in mining.  Seriously people, spend a few minutes looking around before you post.

+1
sr. member
Activity: 378
Merit: 250
I don't get this. Is it price? I thought people would be investing in NVIDIA GTX 580s rather than ATI.

What gives?

I smell a troll.  I don't see why anyone would even post something like this when even a few minutes of reading on the boards/research would clearly show ATI outperforms NVidia in mining.  Seriously people, spend a few minutes looking around before you post.
legendary
Activity: 1428
Merit: 1001
Okey Dokey Lokey
And btw, I had an ATI even before discovering bitcoin, so, uh, Nvidia is not better even outside of bitcoin.

It blatantly is. Nvidia is best for driver support (ATI has the shittiest drivers ever; the 100% CPU bug is STILL present now), gaming performance, and general stability and usage as a normal GPU. Only get ATI if you want to compute. Avoid it otherwise.

Is that a fucking troll?
Let's see you use OpenCL --- OH WAIT, YOU FUCKING CAN'T.
Okay, how about eye-- SHIT, YOU CAN'T DO THAT EITHER.
What about 3D gami---- OH, YOUR GAMES NEED TO BE PROGRAMMED FOR IT?
Okay... let's stop bashing you. Can your $500 GTX 580 get >200 MH/s? NO? IT CAN'T? That's funny, my $120 5830 gets 306 MH/s.
Sooo let's see here... Nvidia is better how?
A shader clock? AKA how Nvidia is rigging the market by putting all new graphics tech on the "shader" clock. Cool.
Can you do tessellation? NO? COOL STORY BRO.

Nvidia is not, IMO, in ANY WAY, SHAPE, OR FORM better than AMD cards.
Here's what Nvidia is doing:
SSAO = shader clock
AO = shader clock
Shadows = shader clock
Bump mapping = shader clock
Lighting effects = shader clock
Reflections = shader clock
AA = core clock
Textures = memory clock
Engine = core clock
Physics = core + memory + shader


Their prices are 2x what they should be. And all this bullshit about drivers? Are you FUCKING kidding me? The last time I had a driver issue with ATI was when I went CrossFire, and I was fucking 14. And what did I do to solve it? Uninstall, reinstall.
OH, THAT'S SUCH A BUG?!!!! LET'S FLAME ATI!

If you lived near me I would fucking smack you for not knowing what you're talking about.

They are programming all modern games to use the shader clock, a clock that NO AMD card has.

Think about it. If AMD WAS ALLOWED (WHICH NVIDIA HAS PUT A BLANKET LAWSUIT POLICY OVER) to put a shader clock into their cards, GOODBYE Nvidia.
Nvidia has labeled the shader clock "technological property of Nvidia", effectively banning AMD from using a shader clock.

And with Nvidia writing the code for all the new games to use the SHADER CLOCK LIKE A FUCKING WHORE, it makes ATI cards seem to perform worse.

They will perform terribly on an Nvidia flagship game. No doubt, no denial.
(Old) dual 5770s in CrossFire = TWENTY fps in NFS Shift, NO MATTER THE GRAPHICS SETTINGS. Maxed and nuked gave the same frame rates.
But dual 5770s in Dirt 2 at max graphics (minus the crowd, set to low) = the frame rate NEVER dropped below 50!

Now you tell me which of those video games looks better graphically.

It's all about the method of testing performance.
And here's a fuckton of methods:
http://alienbabeltech.com/main/introducing-the-worlds-fastest-graphics-card-amds-flagship-hd-6990/all/1

NOT TO MENTION THAT THE RADEON 6990 IS THE WORLD'S FASTEST GPU. ONLY NVIDIA FANBOYS WOULD SAY OTHERWISE.

And before you babble at me about Nvidia's claims of their new card being faster: that's if you've got TWO of them vs. one 6990.
donator
Activity: 1218
Merit: 1079
Gerald Davis
So a natural follow-up to this discussion would be:

What sort of hash algorithm could be relatively platform-neutral? (This would honestly be kind of ideal, so the hardware platform is not as much of a concern and free-market pricing and competition work more smoothly.)

OR: what sort of hash algorithm would favor Nvidia's architecture?

It would be hard, I suspect, to get Bitcoin to amend its hash algorithm with something like this, BUT if it were ever to happen it might be good to have an idea of some options...

My understanding is that Nvidia underperforms AMD in all current cryptographic hashing algorithms. Hackers and password crackers all over the world use AMD GPUs exclusively. Whitepixel, for example, is an open-source MD5 cracker which has vastly superior performance on AMD GPUs (roughly 400% higher throughput when normalized for price).

There are a couple of reasons for this:
1) Prior to Fermi, Nvidia GPUs lacked native 32-bit integers internally. This vastly slows down computations on 32-bit numbers. Given that the 32-bit int is an industry standard for CPU architecture, I don't know of any cryptographic hash which doesn't use 32- or 64-bit numbers internally.

2) Nvidia GPUs lack certain instructions that allow hashing to be completed in fewer steps. There is no reason Nvidia couldn't add these fast operators in the future, but until now cryptographic performance (and integer performance in general) wasn't an important metric.

3) Nvidia's architecture is based on the concept of fewer but more powerful shaders, whereas AMD's is based on more but simpler shaders. AMD's architecture simply fits better with hashing, where multiple simple operations are performed on an operand.

I don't think AMD designed their GPUs to be good at cryptography. They simply happen to be more efficient at cryptographic functions than Nvidia's GPUs are. My guess is that now that the market has seen what performance can be gained by using GPUs for integer operations, and as GPGPU becomes more common, the market will demand that Nvidia deliver better integer performance and better cryptographic efficiency. Some future Nvidia GPU will likely overcome the shortcomings of the current architecture.
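
To make point 2 concrete, here is a small plain-C sketch of the operations SHA-256 leans on. The rotate below takes two shifts plus an OR when there is no native 32-bit rotate, while AMD's ALUs can do it in one instruction (the amd_bitalign / BIT_ALIGN_INT path the mining kernels use), and the bit-select in Ch() is what the well-known BFI_INT kernel patches collapse into a single instruction. The SHA-256 pieces themselves are standard; the framing around them is just a sketch.
Code:
#include <stdint.h>
#include <stdio.h>

/* Rotate right: three ALU ops here, one op on hardware with a native rotate. */
static uint32_t rotr(uint32_t x, unsigned n) {
    return (x >> n) | (x << (32 - n));
}

/* The two "big sigma" functions from the SHA-256 round: 3 rotates + 2 XORs each. */
static uint32_t Sigma0(uint32_t x) { return rotr(x, 2) ^ rotr(x, 13) ^ rotr(x, 22); }
static uint32_t Sigma1(uint32_t x) { return rotr(x, 6) ^ rotr(x, 11) ^ rotr(x, 25); }

/* Ch() selects bits of y or z depending on x: three ops here, but AMD hardware
   can do it as a single bit-select (BFI_INT), which is what the patched mining
   kernels exploit. SHA-256 runs these functions in every one of its 64 rounds. */
static uint32_t Ch(uint32_t x, uint32_t y, uint32_t z) { return (x & y) ^ (~x & z); }

int main(void) {
    /* A few of SHA-256's standard initial hash values, just as sample inputs. */
    uint32_t a = 0x6a09e667u, e = 0x510e527fu, f = 0x9b05688cu, g = 0x1f83d9abu;
    printf("Sigma0(a) = %08x\n", Sigma0(a));
    printf("Sigma1(e) = %08x\n", Sigma1(e));
    printf("Ch(e,f,g) = %08x\n", Ch(e, f, g));
    return 0;
}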