 Lord Evangelian Gallente The White Mantle Malum Exuro |
Posted - 2010.11.17 15:29:00 - [ 1]
I recently upgraded my rig.
My old PC: 12.2 avg fps
My Laptop: 25.8 avg fps
My new PC: 79.4 avg fps
What's yours? |
 CCP StevieSG

 |
Posted - 2010.11.17 15:30:00 - [ 2]
Moved to Out of Pod from EVE General. |
 Lord Evangelian Gallente The White Mantle Malum Exuro |
Posted - 2010.11.17 15:32:00 - [ 3]
I'm talking about in-game... why is that out of pod? |
 Nicholas Barker Deez Nuts. |
Posted - 2010.11.17 15:39:00 - [ 4]
Originally by: Lord Evangelian Im talking about in game... why is that out of pod?
women and computers. *trollface* |
 Lord Evangelian Gallente The White Mantle Malum Exuro |
Posted - 2010.11.17 15:42:00 - [ 5]
Originally by: CCP StevieSG Moved to Out of Pod from EVE General.
Any clarification, Stevie? |
 Vagilicious |
Posted - 2010.11.17 15:56:00 - [ 6]
Edited by: Vagilicious on 17/11/2010 16:11:40
My FPS of choice is currently Fallout New Vegas, until Dust514 is released lol
edit: FTR I know what you actually meant... I haven't checked for a while... (loads EVE) |
 Flap jak |
Posted - 2010.11.17 15:57:00 - [ 7]
Yeah, I have to agree with ya. Not sure why this was moved to OOPE.  |
 Wendat Huron Stellar Solutions |
Posted - 2010.11.17 16:08:00 - [ 8]
Serves you right for tricking her into thinking you meant first person shooter. Also reading the actual OP before commenting is for nubs. |
 Akita T Caldari Navy Volunteer Task Force
|
Posted - 2010.11.17 16:11:00 - [ 9]
Edited by: Akita T on 17/11/2010 16:17:38
... This is about the COMPUTER you have, not about the game itself (it's not any aspect of gameplay, politics, tech support or anything like that), so it DOES belong in OOPE (even if the mod might have thought you meant first person shooter, not frames per second) ...
Old PC, before it fried its vidcard (32b XP SP3, Pentium E2140 1.6GHz dual core, 3.5GB of 4GB 666MHz dual channel DDR, 8500GT with 512MB): ~33 FPS with bloom/hdr/shadows off (IIRC)
Same old PC with a weaker replacement card (Radeon HD 4350 with 512MB): ~28 FPS under the same conditions (IIRC)
New PC (64b Win7 Ult, i5 760 2.8GHz quad core, 4GB 1333MHz XMP dual channel DDR2, 460 GTX with 1GB): ~250 FPS with bloom/hdr/shadows on, ~150 FPS when also forcing max AA settings through tricky 3rd party stuff (Linkage)
All of those are in windowed mode, 1600x1024 on a 1600x1200 desktop, using EVE-Mon repositioning. Obviously, that's for benchmarking only; for regular gameplay I always turn vsynch on ("interval one"), so I'm "stuck" at max 85 FPS (85Hz is my CRT's refresh rate at that resolution; for most LCDs, it would be just 60). |
 GUBZZ |
Posted - 2010.11.17 16:19:00 - [ 10]
I get a solid 60 FPS on max settings at 1680x1050
Although this drops to 30 FPS when I encounter clouds of gas
God damn gas clouds |
 Grez Neo Spartans Laconian Syndicate |
Posted - 2010.11.17 16:21:00 - [ 11]
210 fps or so with all settings on or maxed where available. |
 Michael Kuiper |
Posted - 2010.11.17 17:12:00 - [ 12]
60 FPS, exactly as it should be with any game.
vsync or bust.
A note:
If your frame rate exceeds the refresh rate of your LCD, you're not only wasting frames but causing ugly horizontal screen tearing. (If your FPS is 100+, it most certainly looks much, much worse than 60 fps vsync'd.) |
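Kuiper's "wasting frames" point can be made concrete with a toy simulation (purely illustrative, not from any poster): at each refresh, a 60Hz panel picks up whichever frame finished most recently, so rendering faster than the refresh rate produces frames that are never shown in full.

```python
# Illustrative sketch: a display at `refresh_hz` latches the most recently
# completed frame at each refresh. Frames finished between refreshes are
# never displayed whole -- they are the "wasted" frames.
def frames_shown(render_fps, refresh_hz, seconds=1):
    """Count distinct rendered frames the display actually picks up."""
    render_times = [i / render_fps for i in range(int(render_fps * seconds))]
    shown = set()
    for r in range(int(refresh_hz * seconds)):
        t = r / refresh_hz
        # The display scans out whatever frame completed most recently.
        latest = max((ft for ft in render_times if ft <= t), default=None)
        if latest is not None:
            shown.add(latest)
    return len(shown)

rendered = 100
print(frames_shown(100, 60))             # → 60: the panel can't show more
print(rendered - frames_shown(100, 60))  # → 40 frames drawn but never seen whole
```

At 100 FPS on a 60Hz panel, 40 of every 100 frames go unseen; at or below 60 FPS, nothing is wasted.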
 Zagdul Gallente Clan Shadow Wolf Fatal Ascension |
Posted - 2010.11.17 18:20:00 - [ 13]
Edited by: Zagdul on 17/11/2010 18:33:00
Originally by: Michael Kuiper 60 FPS, exactly as it should be with any game. vsync or bust. a note: If your frame rate exceeds the refresh rate of your LCD, your not only wasting frames, but causing ugly horizontal screen tearing. (if your FPS is 100+ it most certainly looks much much worse than 60 fps vsyncd).
Your logic is flawed. While it doesn't pertain to EVE, in other games where frames matter, having insane FPS is a must, as drawing more frames per second to align objects (such as crosshairs) becomes critical in a win/lose situation.
Think of it this way... The human eye sees at ~30 fps, or at least it begins to recognize and is tricked into seeing motion at roughly this rate. Your GPU is drawing at 60 FPS. During the timeframe of a second, there are instances where the 30 fps and 60 fps don't sync, and it's translated by the human eye as "lag" or stutter. A keen eye can see this, and it will be an annoyance more frequently than for someone who isn't accustomed to looking for it. In other games such as first person shooters, eliminating the stutter or "lag" is imperative, and skilled FPS players seek numbers in the triple digits as a base.
Also, there's no such thing as "wasted" fps. There is overheating your graphics card because it's not designed to run EVE at insane frames. I usually keep my interval set to one, not because my computer can't handle it, but rather because I'd like to extend its life.
To answer the OP: with my interval set to max in EVE, I average 140 FPS with all effects on and hdr/bloom/shadows off. With two clients running in windowed mode, this cuts down to about 110-80 fps. When I go up to 3-4 clients, I begin to have CPU issues and multitasking is rough since I'm still on a dual core.
Specs: AMD Athlon X2 5400+ Black Edition (OC'd to 3GHz), 4GB Patriot 5-5-5-12 PC 6400 "Gamer Series", nVidia 9800 GT on the last driver release (new ones are crap), 64GB OCZ SSD.
EDIT: Also, tearing and V-sync are two different things. Solved with the same "setting", but completely different issues. Tearing is caused when a game engine tries to render distance and doesn't do it seamlessly. Many games are rendered by section, where the "closest" to the player is of a higher, more crisp quality and further objects are rendered with less quality and care.
The breaking apart of these "sections" can cause tearing when moving left <> right quickly. V-sync, or de-sync for that matter, happens on CRTs when the cannons which shoot the tube in your monitor aren't keeping up with the FPS the GPU is drawing at. |
 Merin Ryskin Peregrine Industries
|
Posted - 2010.11.17 18:39:00 - [ 14]
Originally by: Zagdul Your GPU is drawing at 60 FPS. During the timeframe of a second, there are instances where the 30 fps and 60 fps don't synch and it's translated by the human eye as "lag" or stutter. A keen eye can see this and will be an annoyance more frequently than that of someone who isn't accustomed to looking for it. In other games such as first person shooters, eliminating the stutter or "lag" is imperative and skilled FPS players seek numbers in the triple digets as a base.
Translation: idiots who don't know what "refresh rate" means demand 100+ FPS.
Let me put this in simple terms: your LCD screen has a maximum refresh rate of X, which is a limitation of its physical properties. There is no way to increase it without buying a new LCD. Your video card is capable of drawing frames at a rate of Y. It doesn't matter if Y is greater than X, because your screen is not capable of displaying the frames as fast as your video card is drawing them. All that happens is your video card works too hard (or you scale the graphics down to get OMG FPS), and the extra frames are discarded.
Anyone who obsesses over OMG FPS as anything other than bragging about how overkill their hardware is is an idiot who doesn't understand that LCDs have a much lower refresh rate than the old CRTs, which used to benefit from (and demand) OMG FPS. |
 Kazuo Ishiguro House of Marbles
|
Posted - 2010.11.17 18:41:00 - [ 15]
That, or they assume that everyone else they talk to has CRTs or expensive 120Hz LCDs. |
 Michael Kuiper |
Posted - 2010.11.17 18:48:00 - [ 16]
Edited by: Michael Kuiper on 17/11/2010 19:47:21
Zagdul, interesting response. I do appreciate other people's opinions, but you have piqued my curiosity.
60 FPS is around the general area where the human eye cannot detect additional frames drawn per second, which is why 60 fps or 100+ fps looks equally smooth to the human brain (putting other factors such as vsync aside).
Frame rate to me matters everywhere. All games, sidescrollers, movies, cartoons, you name it.
You say "During the timeframe of a second, there are instances where the 30 fps and 60 fps don't synch and it's translated by the human eye as "lag" or stutter".
Are you referring to triple frame buffering in VRAM, or an actual degradation in GPU performance from 60fps to 30fps and back with vsync enabled? Typically, if the fps drops from 60 to 45 to 30 because the target fps cannot be achieved, then you are correct, the fps does drop. However, studies have always shown that this is less immersion-breaking than horizontal screen tear. You're speaking to someone here who has a "very keen eye", so I'm trying to see your perspective on vsync vs no vsync. I do believe there is such a thing as wasted frames if they are never drawn to screen. I also think we have a different understanding of what vsync is. Vsync enabled in a video driver ensures that the frame is drawn completely to VRAM before being drawn to screen; this is apparent on any type of display. CRT, LCD, OLED, you name it. I'm not sure what this de-sync is you are talking about.
Horizontal screen tearing has nothing to do with drawing distances and everything to do with vsync not being enabled, and only portions of the screen being drawn per screen refresh. Horizontal screen tearing has nothing to do with the content of the frame itself, and everything to do with the frame buffer.
You are somewhat correct about one thing: vsync does cause a very slight amount of lag (not to be confused with stutter) IF you are using triple frame buffering. That delay is exactly 2/60ths of a second.
Edit: screen tearing can only occur if an entire frame is not drawn to screen.
Edit: I think you're confusing vsync with Anisotropic Filtering. |
 Running missions |
Posted - 2010.11.17 18:50:00 - [ 17]
30 on laptop, just where it should be <3 |
 Akita T Caldari Navy Volunteer Task Force
|
Posted - 2010.11.17 22:30:00 - [ 18]
Edited by: Akita T on 17/11/2010 22:40:47
Originally by: Michael Kuiper 60 FPS is around the general area where the human eye cannot detect additional frames drawn per second
That's NOT how the human eye works. The human eye (plus the associated brain processing area) operates in an analog/smooth fashion, not a digital/discrete one. Humans CAN detect something that "flashes by" for as little as 1/300 of a second, but will have trouble identifying just what it was (and even more trouble determining what colour it had). The best experiment for this is to take a flat spinning top and paint it half white, half black. Spin it slowly in sunlight and you still see it as white or black in parts, but spin it fast enough and it starts to look grey all over. Any other colour mix works the same. Alternatively, do the same with your own hand, shaking it faster and faster. Now repeat the experiment in artificial light (incandescent and fluorescent will yield different results, and the country you are in will also change how the end result looks, since some have 220V at 50Hz, others 110V at 60Hz, and there are probably others too). You'll see what I mean quite easily.
Now, on to existing formats... For "classic" TV, you have PAL/SECAM at 25 FPS and NTSC at 30 FPS (more precisely 29.97)... well, they're actually 50 and 60 FPS respectively, but interlaced, so, meh. By the way, a lot of console games are actually locked (in a similar fashion to vsynch on a PC) at 30 FPS for NTSC and 25 FPS for PAL. The vast majority of cinema movies (the "classic" kind, on celluloid film) were actually 24 FPS with no interlacing.
Also, NOT turning vsynch on will mostly have more drawbacks than advantages overall, since your screen simply CAN NOT DISPLAY more distinct full images per second than its refresh rate anyway.
What you'll actually get if FPS > monitor refresh rate with vsynch off will be:
* EITHER (on very few monitors) a lot of dropped frames which you'll never actually see (which might improve responsiveness, but then again, you could get very similar results by lowering the "maximum CPU pre-rendered frames" setting in your video card drivers from the usual default of 3 to 1 or even 0, disabling triple frame buffering, and then turning vsynch back on)
* OR (on most monitors) some of the displayed images being composed out of two or more images, in vertical slices (so, for instance, monitor frame 1 being composed of PC frame 1 on top, PC frame 2 in the middle and bottom, plus a little of PC frame 3 at the very bottom; then monitor frame 2 being composed of PC frame 3 on most of the top and PC frame 4 on the bottom; and so on, depending on how PC frame compute time changes from frame to frame), which is obviously BAD
Bottom line: you can have a very fluid-looking and responsive 30 FPS game experience on a 60Hz monitor (especially if the game employs little "motion blur" tricks and you tweaked the drivers as instructed), or the game could appear to stutter and feel slightly unresponsive even with a 120 Hz monitor and 120 FPS (on particularly hectic games with no "motion blur" whatsoever and the "wrong" driver settings). |
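The "vertical slices" case Akita T describes can be simulated with a toy model (illustrative numbers only, not from the thread): the display scans the screen top to bottom over one refresh interval, and with vsync off the current frame can change mid-scan, so one displayed image is stitched from several rendered frames.

```python
# Illustrative sketch of tearing: count how many distinct rendered frames
# (i.e. visible "slices"/tear bands) end up inside a single scanned-out image
# when the GPU swaps buffers mid-scanout with vsync off.
def slices_in_one_scanout(render_fps, refresh_hz, lines=1080):
    scan_time = 1 / refresh_hz                 # time to scan the whole screen
    frame_ids = []
    for line in range(lines):
        t = line / lines * scan_time           # when this scanline is drawn
        frame_ids.append(int(t * render_fps))  # which rendered frame is current
    # Each contiguous run of the same frame id is one visible slice.
    return sum(1 for i, f in enumerate(frame_ids)
               if i == 0 or f != frame_ids[i - 1])

print(slices_in_one_scanout(60, 60))   # → 1: one frame per scanout, no tear
print(slices_in_one_scanout(100, 60))  # → 2: one tear line per image
print(slices_in_one_scanout(250, 60))  # → 5: several stacked slices
```

The higher the uncapped frame rate relative to the refresh rate, the more slices per displayed image, which is exactly the stacked-bands effect described above.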
 Barakkus |
Posted - 2010.11.17 22:59:00 - [ 19]
OVER 9000!!!!!!!!
...sorry...had to... |
 Alvar Ursidae Amarr Kangaroos With Frickin Lazerbeams
|
Posted - 2010.11.17 23:47:00 - [ 20]
Max everything, with AA independantly run through drivers, I get smooth everything and 250fps. |
 thatbloke Gallente |
Posted - 2010.11.18 10:21:00 - [ 21]
The thing that seems to have been missed here is that although your monitor may not be able to display more than 60Hz, and your eyes may not be able to pick out much detail above that, a higher FPS means the internal logic state of the game is also being updated more often - limiting your FPS to 60 by turning vsync on means that the game/application will only ever update its internal state a maximum of 60 times per second.
By removing this limit, although you may not necessarily SEE all of the updates on-screen, the game is utilising the available processing power to update its internal state, hence the game WILL feel much more responsive (even though you may not necessarily see every update).
This is why, for me, vsync is ALWAYS the first thing to get turned off whenever I install a game. |
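A caveat on the argument above: it only applies to engines that tie their simulation step to the render loop. Many engines instead use a fixed-timestep loop that decouples the two, in which case vsync caps rendering but not the update rate. A minimal sketch of that pattern (generic illustration, not EVE's engine):

```python
# Sketch of a fixed-timestep game loop: the simulation advances in fixed
# `dt` steps inside an accumulator, independent of how fast frames are shown.
def run(frames, update_hz=120, render_hz=60):
    """Simulate `frames` vsync'd display frames; return total sim updates."""
    dt = 1 / update_hz                 # fixed simulation step
    frame_time = 1 / render_hz         # vsync paces each pass at the refresh rate
    accumulator = 0.0
    updates = 0
    for _ in range(frames):            # one pass per displayed frame
        accumulator += frame_time
        while accumulator >= dt:       # run as many fixed updates as have accrued
            updates += 1
            accumulator -= dt
    return updates

print(run(60))  # → 120: one second at a vsync'd 60 FPS still yields 120 updates
```

With this structure, the game state updates 120 times per second even though only 60 frames reach the screen, so capping FPS does not have to cost responsiveness.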
 Dray Caldari Euphoria Released HYDRA RELOADED |
Posted - 2010.11.18 10:51:00 - [ 22]
With or without v-sync I still suck balls at BC2.  |
 Carine Parnasse |
Posted - 2010.11.18 16:59:00 - [ 23]
Originally by: thatbloke The thing that seems to have been missed here is that although your monitor may not necessarily be able to display more than 60Hz, and your eyes may not be able to pick out the detail on anything much higher than that, however, higher FPS means that the internal logic state of the game is also being updated - limiting your FPS to 60 by turning vsync on means that the game/application will only ever update its internal state a maximum of 60 times per second.
By removing this limit, although you may not necessarily SEE all of the updates on-screen, the game is utilising the available processing power to update its internal state, hence the game WILL feel much more responsive (even though you may not necessarily see every update).
This is why for me, vsync is ALWAYS the first thing to get turned off whenever I install a game.
My brain hurts. How do you define responsive? Do you think that the controls will be smoother? Less lag? Something else? And how will forcing your graphics card to render frames for no reason achieve this? The game's 'internal state' isn't synchronized to the framerate. And... you are aware that all the information passes through the screen to your brain? Even if a higher framerate did make the game faster, the images on the screen, and thus your experience of them, won't change faster than the screen's refresh rate. |
 Sinister Dextor |
Posted - 2010.11.18 17:46:00 - [ 24]
V-Sync on. Because I'm not an idiot. |
 Grimpak Gallente Midnight Elites Echelon Rising |
Posted - 2010.11.18 17:57:00 - [ 25]
Originally by: Akita T That's NOT how the human eye works. [...] Bottom line: you can have a very fluid-looking and responsive 30 FPS game experience on a 60Hz monitor [...]
you forgot the bit where your gpu's lifetime will be extended, since it's not in a rush to process everything at max speed, thus decreasing heat and energy consumption. So yes, vsync, beyond improving your visual quality, will also make your card greener, even if it's an ATI  |
 Viral Effect Caldari BRAINDEAD Corp
|
Posted - 2010.11.18 18:19:00 - [ 26]
|
 Carodem Ohashi Caldari Harbingers Of Destruction
|
Posted - 2010.11.18 19:44:00 - [ 27]
In space: 410 FPS Avg In station: 470 FPS Avg
I keep it sync'd to 60 FPS though. |