Topic: Why do people only buy ATI cards when NVIDIA is so much better? - page 2

donator
Activity: 1218
Merit: 1079
Gerald Davis
Okay seriously, it's not an "incredible hoop" to do this to fix "PowerPlay":
Run Regedit
Ctrl+F -> ULPS
Set every EnableUlps value from 1 to 0

However, that merely disables the low-power idle mode, which is fine for 24/7 mining rigs, but the entire point of ULPS was to reduce heat, noise, and power consumption when the card is idle. If the solution is to turn it off, that doesn't reflect well on AMD.
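For anyone who is going to do it anyway and would rather not hunt through Regedit by hand, here is a rough Python sketch of the same registry edit. Treat it as a sketch only: it assumes EnableUlps lives under the usual display-adapter class key (that is where it shows up on the machines I have seen), it needs to run elevated, and you should back up the registry first.

Code:
# Sketch: flip every EnableUlps value to 0 under the display-adapter class key.
# Assumption: the value lives under the standard {4d36e968-...} class GUID; verify on your system.
# Windows only, run as administrator, back up the registry first.
import winreg

BASE = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as base:
    i = 0
    while True:
        try:
            sub = winreg.EnumKey(base, i)  # "0000", "0001", ... one subkey per adapter
        except OSError:
            break  # no more subkeys
        i += 1
        try:
            with winreg.OpenKey(base, sub, 0, winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
                value, vtype = winreg.QueryValueEx(key, "EnableUlps")
                if value != 0:
                    winreg.SetValueEx(key, "EnableUlps", 0, vtype, 0)
                    print(sub + ": EnableUlps set to 0")
        except OSError:
            pass  # no EnableUlps value here, or a protected subkey we can't open

Reboot afterwards, and keep in mind a driver reinstall may well set the value back to 1.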
legendary
Activity: 1428
Merit: 1001
Okey Dokey Lokey
Okay seriously, it's not an "incredible hoop" to do this to fix "PowerPlay":
Run Regedit
Ctrl+F -> ULPS
Set every EnableUlps value from 1 to 0
hero member
Activity: 774
Merit: 500
Lazy Lurker Reads Alot
Exactly, n4l3hp.
I have been running BOINC and BOINC-like projects for more than a decade, and some of the ATI cards are still running without any issue.
Funnily enough, I also have a DFI nF4 Expert SLI that is still alive with an Opteron 175 on it, which was only used to run MilkyWay before I had enough of BOINC for now. I think I have sponsored them a pretty house by now, so time to move on.

I must say the Intel boards and Compaqs are pretty sturdy as well; a few are still running with a P2 or P3 in some dark corner of a factory hall, doing their work despite tons of dust piled up inside.
I have cleaned them a couple of times, but lol, those beasts keep running without a hitch Cheesy
That makes me wonder if the old ATI cards will do the same Cheesy. The oldest is an ATI Radeon VE with 16 MB, which is still being used by a game junkie for running older games that no longer run on these super-fast new cards.

full member
Activity: 173
Merit: 100
lol
My 2 cents on this is easy:

NVIDIA suxx big time: I have had the most cards die on me from NVIDIA, the worst drivers, and they are the biggest scammers, endlessly rebranding the same product.
Even today, after 4 months of use, another card died, and again it was NVIDIA crap. So the score so far: NVIDIA, 5 out of 8 died, dead, kaput, gone.
ATI: only 1 out of 27 really died; true, 1 other has been replaced, but it was still working even at 110 °C.
And yes, when you overclock these cards they will slow down over time, but then again, you wanted to overclock, and in most cases they will not die completely.
So far all the cards except the dead one are still working (not overclocked) at family and friends' places, and everyone is happy with my old cards.
Yes, ATI needs to put more money into driver development, which in my view will pay off big time, but I favor any ATI over any NVIDIA; only on the low-budget cards would I say it does not matter which you buy.

Sure, NVIDIA works better in a few games on their product, but YOU PEOPLE must understand those games are made specifically for those cards, and the makers make sure ATI will never run better than the big-time paying scammer NVIDIA.
Yes, NVIDIA pays them a lot of money to keep their product fastest; in titles where no card is favored by the secret donations (or whatever you want to call the payments made by NVIDIA), you see a totally different score.
Now yes, some games will benefit from one or the other, but to call ATI crap is just stupid. The parts ATI uses are much better quality than what NVIDIA uses; hence the nice cheap capacitors that blew up. ATI has been using the best Japanese ones as far as I know, and again has a far lower failure rate.

So to end this discussion: NVIDIA sells crap, period.


+1

Before I got into BOINC and then Bitcoin, I didn't care what card I bought as long as it was readily available at my local computer store and I could afford it. Over the years, I can't count how many I've bought and sold. My two sons' computers both used to have NVIDIA cards for gaming, while I used ATI/AMD cards on my personal rigs that were running BOINC and are now mining BTC.

Guess what: all the NVIDIA cards died (only ever used for gaming at stock settings), while my 3850s and 4850s are still alive and crunching BOINC (all OC'd and running 24/7 for a few years) and the 6870s are mining BTC without hiccups.

Same for the motherboards: all the ones with NVIDIA chipsets died, usually a few months after the warranty expired (I used to operate an internet cafe business until last year), except for my old trusty Epox nForce 4 Ultra (with a dual-core Socket 939 Athlon 64), which found a new home inside my wife's computer with a 5670 attached to it.
hero member
Activity: 774
Merit: 500
Lazy Lurker Reads Alot
Well, lol, I can't resist answering again. I found when I was gaming that all the games that open with the "made for NVIDIA" splash have a problem with PowerPlay; I wonder if any other game I have not played does it too.
So far all of them had that crap green logo, and of course I have not played every game and never will. And yes, PowerPlay can be addressed by using the BIOS editor and turning it completely off.
Now, to be honest, I do not think you would like that unless the card is in a dedicated miner.
For all those who, like me, do more things than mining on their PC, switching to lower power consumption does lower the huge bill,
and we like that, even though it can be a pain in the ass XD
Yes, the solutions shown by JackRabitt worked wonders for me too; I actually still use some of them when needed.
I would like to see those companies release the drivers as open source, because I know there are a lot of wizards out there who are much better than the ones working at those companies.
Remember the Omega drivers? ... No? Then you're really not from this world; those were awesome.
Many of those guys made failing drivers from either company work like they should.
Sadly they all stopped, most because they lost their jobs or completely disappeared. That is the issue with open source, but I am certain people would come back if they could get some donations from the people using their work.
So for now you are stuck with the programmers from ATI and NVIDIA, who need a lot of time to fix some issues, like the Crossfire problem, which took ages xD. Now, I do not dare say they suck, but lol, sometimes a bug fixed in the previous version comes back in the next one, and yes, on both brands.
I still say AMD has to invest more in driver programmers, because it will pay off >.<

sr. member
Activity: 406
Merit: 250
QUIFAS EXCHANGE

Media buzzword for a function where the card switches GPU speeds on the fly for idle mode, video-only mode, etc. So it idles at something like 150 MHz, plays videos at 400 MHz, runs games at 800 MHz, crap like that.

Problem is that it switches automatically according to load, and some apps may only trigger the switch to high-speed modes after they have already stuttered for a while...

That's not the worst part. Attach a second monitor and see what happens!

I lol'd when I read this. If anyone has experienced this, they'll lol too. So true... so annoying... Nothing worse than watching your 2nd monitor jump like a fucking rabbit, with tear lines across the screen. Common sense, ATI/AMD: PowerPlay is broke as a bitch. I guess it never occurred to ATI/AMD to do testing before releasing the technology. *Insert Facepalm Here*

I actually found a way to deal with this thanks to mining. I had this issue on some of the lower-end cards, from the 5450 up to the 5830. I found that if I ran cgminer with the cards disabled for mining, but set the clocks beforehand, I could keep it from switching. This seemed to stop after version 2.0.3.
hero member
Activity: 518
Merit: 500
nVidia disables their variant of PowerPlay when you attach a second monitor. People bitch about high idle temps with two monitors, and I guess rightly so, but it sure beats the unbearable screen tearing you get on AMD and the incredible hoops you have to jump through to try and disable PowerPlay. In the end I gave up and just used MSI Afterburner to fix the clocks so dual monitors would be usable. Kind of ironic how AMD markets their cards for 6-way Eyefinity but can't seem to make 2 monitors work.
hero member
Activity: 770
Merit: 502

Media buzzword for a function where the card switches GPU speeds on the fly for idle mode, video-only mode, etc. So it idles at something like 150 MHz, plays videos at 400 MHz, runs games at 800 MHz, crap like that.

Problem is that it switches automatically according to load, and some apps may only trigger the switch to high-speed modes after they have already stuttered for a while...

That's not the worst part. Attach a second monitor and see what happens!

I lol'd when I read this. If anyone has experienced this, they'll lol too. So true... so annoying... Nothing worse than watching your 2nd monitor jump like a fucking rabbit, with tear lines across the screen. Common sense, ATI/AMD: PowerPlay is broke as a bitch. I guess it never occurred to ATI/AMD to do testing before releasing the technology. *Insert Facepalm Here*
legendary
Activity: 1178
Merit: 1014
Hodling since 2011.®
Go NVIDIA, make that OpenCL fly, I dare you! Smiley
hero member
Activity: 518
Merit: 500

Media buzzword for a function where the card switches GPU speeds on the fly for idle mode, video-only mode, etc. So it idles at something like 150 MHz, plays videos at 400 MHz, runs games at 800 MHz, crap like that.

Problem is that it switches automatically according to load, and some apps may only trigger the switch to high-speed modes after they have already stuttered for a while...

That's not the worst part. Attach a second monitor and see what happens!
legendary
Activity: 882
Merit: 1000
lol@this thread.

Okay, so either someone honestly didn't know or they are trolling, but why say "How come, because NVIDIA is SO MUCH BETTER"? Poppycock.

I have used NVIDIA GeForce 240s here in Korea to mine, and they get up to about 30 Mh/s with practically no additional power. That is their plus. If you have a botnet, GeForce is totally practical and cost-effective.

If you're condensing everything into a single rig, NVIDIA is retarded.

Wait a second, a botnet? Are you sure you are using that term correctly?
full member
Activity: 136
Merit: 100
Just, right off the bat (AMD fanboy here): what the fuck is PowerPlay? And don't tell me to fucking Google it, I want YOU to tell me what it does, because I've never heard of it.

Media buzzword for a function where the card switches GPU speeds on the fly for idle mode, video-only mode, etc. So it idles at something like 150 MHz, plays videos at 400 MHz, runs games at 800 MHz, crap like that.

Problem is that it switches automatically according to load, and some apps may only trigger the switch to high-speed modes after they have already stuttered for a while... and when running at full speed, one or two games may have some low-complexity scene that takes less power to render, so the card switches back to idle mode mid-game and only switches up again after some more stuttering.
This behavior may be optimized per-game, though; I've only seen it happen in some emulators, which don't exactly get support from the driver team, and in windowed mode.

However, it's still better than how NVIDIA cards idle at 60 °C and burn the fuck down if the cooling fan doesn't run for a moment. And no, Radeons don't burn if the fan stops either. When I was testing my 5850 with a passive Accelero heatsink (no fan), the card hit 130 °C and then instantly halved its own speed so the temps could drop and the thing wouldn't melt itself on the spot.

Now, I've owned ATI cards for a long time, and I agree that the drivers have several retarded issues. But to say that NVIDIA has better drivers, that's just paid NVIDIA fanboy ranting nowadays. And a lot of the issues come from the fact that the typical gamer has an average of 96 processes running on his PC at the same time.
sr. member
Activity: 280
Merit: 250
Firstbits: 12pqwk
Before I found out about Bitcoin, I bought an NVIDIA GTX 570, a sweet, solid GPU with 3D Vision support.

After Bitcoin, I traded that 570 for an XFX 5850 + 5870; it was a sweet trade at the time Smiley
legendary
Activity: 1428
Merit: 1001
Okey Dokey Lokey
Tearing, stutters, and screen flickers, really.

Yup, same here, sick of the BS. Don't know how it works for NVIDIA, but ATI's PowerPlay is fucking stupid; it's what causes the screen flickering. The only way to fix/disable that shit is to use MSI Afterburner or hack into the ATI driver and set something up to disable it. But, gawd lordy, I hope NVIDIA doesn't have that shit, or that NVIDIA gives the end user the choice to easily disable PowerPlay.

There are a whole lot of people who agree on how stupid PowerPlay is on ATI/AMD cards.

Btw, the Sapphire TRIXX programmer doesn't seem to care to implement a way to disable PowerPlay like MSI Afterburner has.

Just, right off the bat (AMD fanboy here): what the fuck is PowerPlay? And don't tell me to fucking Google it, I want YOU to tell me what it does, because I've never heard of it.
Tearing? I'll just assume that you're not talking about screen V-tears. And well, can't argue with that; some games just fuck up on certain ATI drivers and it's annoying as hell.
Stutters? That's the easy one to fix: go into the CCC and turn off AMD Optimized Tessellation as well as Surface Format Optimization. These options are for crappy cards and cause stuttering on high-end ones (I'm running Crossfire XFX 6870 Black Edition dual-fans and I was stuttering like a whore on crack before I turned this shit off). Then set the rest to "application controlled".

Screen flickers... that was a Crossfire bug. I had that on Crysis 2 for a little while, but with a driver update it vanished (DX11 hi-res advanced).
And I could stop the screen flickers by turning on Vsync.
sr. member
Activity: 420
Merit: 250
I wasn't talking about the driver, I was talking about the hardware. The GMA 3000 is their best yet, but it barely compares to the AMD APUs.
hero member
Activity: 518
Merit: 500
Kind of my point. If Intel, with all their might and even with the help of the OSS community, can't make more than half-baked Linux drivers for their relatively simple hardware (not sure why you exclude Sandy Bridge, btw, as that's a complete trainwreck on Linux), I wouldn't hold my breath for the OSS community to out-engineer nVidia in this regard, particularly not without full, unrestricted access to all the specs, and without having those specs years before the hardware's release the way the internal driver teams at AMD and NVIDIA do.

Now, I do agree that over the past years AMD has made remarkable progress, particularly on the Windows gaming drivers, but the gap with NVIDIA is still huge on Linux (and with NVIDIA's new focus on Tegra and Linux-based Android, I don't expect AMD to close that gap anytime soon).

Anyway, for me it's incredibly simple: for Bitcoin mining there is obviously only one choice. For Windows gaming, either is good, with AMD generally having a price/performance advantage. For Linux and most professional apps, nVidia is the obvious choice.
sr. member
Activity: 420
Merit: 250
I suppose, but then again, Intel itself was completely terrible in the GPU department until Sandy Bridge, still doesn't match up with dedicated parts, and BARELY breaks even against AMD's APU shit.
hero member
Activity: 518
Merit: 500
Since ATI was eaten by AMD, AMD has released and open-sourced much of the drivers...

No, they haven't. They have released partial specs for older cards, so the community has been able to build usable drivers. Not great drivers, but usable. Well, if you don't game, that is.

Quote
...and it looks like they are moving toward releasing them all. This will make AMD's drivers FAR better than NVIDIA's in the long run.
With open specs, open drivers, and hundreds of thousands of eyes looking at the code, things will get fixed and working much quicker... well, if you're on Linux Tongue, but it will roll over to Windows too.

I've heard nothing of AMD (or NVIDIA) planning to open up their proprietary drivers. Even so, much as I am an OSS fan, creating good 3D video drivers is no easy task and requires in-depth knowledge of the underlying hardware; I wouldn't expect miracles from open source here. Just look at Intel GPU drivers: they are open source, and have been for ages, but they still utterly and completely suck. Let's not mention VIA Chrome drivers. Love 'em or hate 'em, nVidia is head and shoulders above the competition when it comes to Linux drivers.


sr. member
Activity: 420
Merit: 250
Since ATI was eaten by AMD, AMD has released and open-sourced much of the drivers, and it looks like they are moving toward releasing them all. This will make AMD's drivers FAR better than NVIDIA's in the long run.
With open specs, open drivers, and hundreds of thousands of eyes looking at the code, things will get fixed and working much quicker... well, if you're on Linux Tongue, but it will roll over to Windows too.
member
Activity: 112
Merit: 10
The idea is novel, but the implementation is horse shit.

ATI is working from behind, though (and doing well on the whole). I got a 5790 for 310 off eBay that's still WELL worth the price. NVIDIA's OCD-style attention to detail got a shitload of my money for years. AMD is doing a half-decent job of catching up, but NVIDIA (much like Intel recently) has done a spectacular job of keeping their on-screen rendering limited to what is smooth rather than solely what is fast.