All Channels
Out of Pod Experience
Vent About Computers, GPUs, And My Own Stupidity
 
This thread is older than 90 days and has been locked due to inactivity.


 
Pages: 1 [2]


Rashmika Sky
Amarr
R. Sky Escorts
Posted - 2011.08.18 22:31:00 - [31]
 

Originally by: Astenion
And to be honest, I'd return that GPU and get anything nVidia. ATI has TERRIBLE drivers, and their cards burn out after about two years. They are faster, but only because they're made to run hotter than other cards - and the hotter they run, the faster they wear out. Just something to keep in mind. Oh, and by "faster" I mean they can render faster than some of nVidia's cards, but it doesn't really matter, since the difference will be invisible to the naked eye. Basically you're buying geek specs that no one cares about, and when your card burns out, your buddy will still be playing on his nVidia.


Almost everything Astenion has posted here has been good, even great, advice, but I'm not so sure about this. From personal experience, every GPU I've had has eventually died on me. ATI, Nvidia, they've all ended up the same. On average the cards lasted 3 years or longer, and I haven't noticed any reliability advantage of one brand over the other. Remember also that in 2-3 years the new graphics hardware will outclass the old, so unless you have no interest in new games, or really need/want to avoid upgrading your GPU for financial reasons, I suggest not worrying too much about longevity. Though I have used each of my cards until it died, I used to make little enough money that I didn't like spending it needlessly; as long as a game could hold a consistent 25+ fps at the lowest quality and resolution (with GPU scaling turned off to keep a 1:1 pixel ratio for a sharp, but small, image) and the most performance-tweaked settings, it wasn't time to upgrade. I'd feel differently about that now - if there were any games worth upgrading for, anyway.

So far as drivers go, ATI does have a reputation for bad drivers; however, I believe that is a holdover from their drivers back in the 90s. At any rate, I haven't experienced anything to convince me that one brand has better drivers than the other; perhaps on Linux it's different, but even there it seems ATI has been making headway the past several years, and I personally haven't had issues on that front.

On the subject of software, there are an awful lot of games with the Nvidia logo popping up at load time; if such a game *did* run better on Nvidia cards (and it isn't unheard of for ATI to do better, btw), who's to say that it has anything to do with driver quality, as opposed to developers putting more effort into their Nvidia support for various reasons? However, that in itself could be considered a reason to buy Nvidia, if you think developers are going to provide better support for that platform.

I haven't been in the market for a GPU recently, but historically, I believe ATI has provided the better price:performance ratio. I think for the most part, Nvidia has had the faster cards - at the high end - but ATI has had the faster cards at the sensible price ranges.

Lest I come across as an ATI fan or an Nvidia hater, let me point out that I have, oddly enough, alternated brands each time I replaced a GPU; not through any bad feelings towards the previous brand, but due to which cards were the best available for my needs at the time. I currently have an ATI 5850, which I've had no problems with - and let me add, for the heck of it, that when that was current, Nvidia's new cards were considered toasters and power hogs (the first Fermi architecture); so do your research on whatever GPU you buy, brand name doesn't determine how hot a GPU is.

I expect my current GPU will last me at least another year or two, unless I decide to replace it early; though with PC gaming in its present state, who needs a cutting-edge graphics card anyway? Just buy a console - that's where all the games originate nowadays, and, judging by how much more effort developers/publishers put into DRM and selling content to fill empty games than into making a good port, that's "The way it's meant to be played." Yes, I'm bitter, but not about my GPU. Nvidia and ATI have both put out great cards over the decades; that's why they're both still around (even if ATI is part of AMD now).


Astenion
Gallente
Spiritus Draconis
Posted - 2011.08.19 00:27:00 - [32]
 

Don't get me wrong, ATI makes fantastic cards...it's just that due to my own and my friends' past experiences with ATI, I never trust them. There's always something that goes wrong, even with vanilla settings. I don't overclock...I simply install the card, install the drivers, and go. nVidia has always pulled through for me; the shortest-lived nVidia card I've owned was my last one, which died last year at the ripe old age of 4 years.

It could be that my friends and I have all just had runs of bad luck with ATI, but it's been enough to put me off buying another Radeon when nVidia has always come through for me.

Herping yourDerp
Posted - 2011.08.19 02:04:00 - [33]
 

Don't upgrade stock computers (other than RAM). It's a pain in the ass; they use the lowest-rated PSUs possible, so a new graphics card or CPU won't work right.
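
If you do want to risk it, it's worth a minute of arithmetic first: add up roughly what the upgraded box would draw and compare it against the PSU's label. A minimal sketch in Python - every wattage below is an illustrative assumption, not a spec, so look up your actual CPU, card and PSU:

    # Rough PSU headroom check before putting a new graphics card in an OEM box.
    # All figures are illustrative assumptions - substitute your real components.
    CPU_W = 95         # assumed CPU TDP
    NEW_GPU_W = 170    # assumed board power of the new card
    REST_W = 75        # rough guess: motherboard, RAM, drives, fans
    PSU_RATED_W = 300  # the lowest-rated unit the OEM could get away with

    total_draw = CPU_W + NEW_GPU_W + REST_W
    # Cheap PSUs shouldn't run near their label rating; 80% is a common rule of thumb.
    safe_budget = 0.8 * PSU_RATED_W

    print(f"Estimated draw {total_draw} W vs safe budget {safe_budget:.0f} W")
    if total_draw > safe_budget:
        print("Replace the PSU first (or pick a lower-power card).")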

Akita T
Caldari Navy Volunteer Task Force
Posted - 2011.08.19 05:32:00 - [34]
 

I've personally had - as far as I can remember - one 3DFX card (back then they were separate from the video card, called "accelerator" cards; it was a Voodoo2, I think purchased in autumn of 1998), two ATI cards and four NVIDIA cards.
Two of them died on me; the rest just became "hopelessly obsolete".
Those that croaked were both NVIDIA. Ok, they didn't exactly die; they still kind of work, but not properly.
Funnily enough, the ATI ones were purchased as very cheap temporary replacements for the NVIDIA ones that went bonkers, and they're still in use in the respective machines (granted, the current users seldom play 3D accelerated games, so, meh).

One was an NVIDIA GeForce MX2 (yes, that was a card from almost a decade ago); the thermal compound linking the GPU chip to the heatsink eventually *carbonized* and stopped conducting heat properly, and I couldn't remove it for fear of damaging the chip.
Card still works (even accelerated 3D), but leaves ugly horizontal "shadows" all across the screen from any high-contrast edges.
Happened after about 3 years of use.

The other was an NVIDIA GeForce 8500 GT (passively cooled), which had the ugly tendency to peak at well over 100C in the summer (the machine was purchased in late autumn).
That one still works mostly fine in 2D when cool(-ish), but once it heats up, it goes bonkers (and starting any 3D app makes it heat up).
Happened after about 2 years of use (or two and a half).

Something Random
Gallente
The Barrow Boys
Posted - 2011.08.19 10:19:00 - [35]
 

There was a 'known throughout techieland but not admitted to by nVidia' heat issue with the 8000 series chips and GPUs, though it didn't affect all of them. I had an 8800 GTS 512 that is still a happy beast - and I maintain it has one of the nicest heat exhaust solutions to date too. BFG were the makers.

This thread is a tale of woes... and it begins with the two simple characters H and P. Others you'd do well to avoid, because of some or all of the same 'tricks', are Packard Bell, Dell, Advent and Medion. In laptop world, 'you makes your choices', as they say. If you can't, or simply don't need to, build your own, buy from a good builder - MESH, Chillblast, the Zoo folks... THAT kind of builder.


Astenion
Gallente
Spiritus Draconis
Posted - 2011.08.19 12:44:00 - [36]
 

Originally by: Akita T
I've personally had - as far as I can remember - one 3DFX card (back then they were separate from the video card, called "accelerator" cards; it was a Voodoo2, I think purchased in autumn of 1998), two ATI cards and four NVIDIA cards.
Two of them died on me; the rest just became "hopelessly obsolete".
Those that croaked were both NVIDIA. Ok, they didn't exactly die; they still kind of work, but not properly.
Funnily enough, the ATI ones were purchased as very cheap temporary replacements for the NVIDIA ones that went bonkers, and they're still in use in the respective machines (granted, the current users seldom play 3D accelerated games, so, meh).

One was an NVIDIA GeForce MX2 (yes, that was a card from almost a decade ago); the thermal compound linking the GPU chip to the heatsink eventually *carbonized* and stopped conducting heat properly, and I couldn't remove it for fear of damaging the chip.
Card still works (even accelerated 3D), but leaves ugly horizontal "shadows" all across the screen from any high-contrast edges.
Happened after about 3 years of use.

The other was an NVIDIA GeForce 8500 GT (passively cooled), which had the ugly tendency to peak at well over 100C in the summer (the machine was purchased in late autumn).
That one still works mostly fine in 2D when cool(-ish), but once it heats up, it goes bonkers (and starting any 3D app makes it heat up).
Happened after about 2 years of use (or two and a half).



My only nVidia card that died was the GeForce 8800 GTS 512...after 4 years of extremely heavy gaming. That card was ****ing awesome...it was nearly as fast as a 1 GB card.

Nite Piper
Posted - 2011.08.22 20:59:00 - [37]
 

It seems I also feel a need to comment on the ATI bashing here:

Comments like those (about anything) are very common in hardware discussions, because a lot of people swear by something, whether from personal bad experiences or from what we can call fan-boyism. My advice is always to get a second opinion. Sometimes the bashing is true, very true - Intel graphics, VIA chipsets back in the late 90s, and so on - but often it can be safely ignored.

The ATI brand is gradually being retired in favor of AMD branding, since that's essentially what it is these days; there was a large reorganization after the purchase. The effects have been visible since the HD 3000 generation, which was the first one designed under AMD leadership. As for drivers, my impression is the opposite of the bashers': AMD/ATI have generally been better in recent years. I also believe Microsoft has statistics that bear this out.
I can't comment on GPUs burning up, since that has never happened to me. I still have all my graphics cards, nVidia and AMD/ATI, going back to an old GF3 Ti200, and they all still work as well as they ever did.

Finally, as for running hot: how hot the chips run is a function of the card's cooling. That is up to the card's manufacturer and the individual model of the card, and has basically nothing to do with the brand of the GPU itself. So what should be bashed is the manufacturer or the individual model of card, not the GPU or the GPU brand. The exact same goes for CPUs. Anyone who claims either Intel or AMD runs hotter has no clue and should not be listened to. If the CPU runs hot, it's a failure of the PC manufacturer, who provided inadequate cooling, not a failure of the CPU.

What is certain, though, is that nVidia GPUs draw more watts and therefore generate more heat than AMD's. How hot the chip actually runs, though, just depends on the cooling.
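
To put a number on that, here is a minimal back-of-the-envelope sketch, assuming a simple steady-state model where die temperature is roughly ambient plus dissipated power times the cooler's thermal resistance (all figures are illustrative assumptions):

    # Steady-state estimate: die temp ~= ambient + power * cooler thermal resistance.
    # Same hypothetical 150 W chip on two hypothetical coolers.
    AMBIENT_C = 25.0      # air temperature inside the case (assumed)
    CHIP_POWER_W = 150.0  # heat the GPU dumps into its heatsink (assumed)

    coolers = {
        "cheap single-slot blower": 0.50,  # K/W, assumed thermal resistance
        "big dual-fan open cooler": 0.25,  # K/W, assumed thermal resistance
    }

    for name, r_th in coolers.items():
        die_temp_c = AMBIENT_C + CHIP_POWER_W * r_th
        print(f"{name}: ~{die_temp_c:.0f} C")
    # ~100 C vs ~63 C for the identical chip: the cooler, not the brand, sets the temperature.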

P.S. One thing to keep an eye on, in the context of overheating components, is dust buildup in the cooling fins of coolers, as well as faltering fans.
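
And if you'd rather catch the dust before the crashes, logging the temperature over time works. A minimal sketch, assuming a single nVidia card with the nvidia-smi tool available on the PATH (other vendors have their own utilities); a slow upward drift at idle over weeks usually points to dust or a faltering fan:

    import subprocess
    import time

    # Log the GPU core temperature once a minute.
    # Assumes a single nVidia GPU and nvidia-smi on the PATH.
    while True:
        result = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        temp_c = int(result.stdout.strip())
        print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  {temp_c} C")
        if temp_c > 90:  # alert threshold is an arbitrary illustration
            print("Running hot - time to clean the heatsink?")
        time.sleep(60)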





 

