EVE General Discussion
60 fps restriction - why?
 
This thread is older than 90 days and has been locked due to inactivity.


 
Pages: 1 2 [3] 4 5

Author Topic

Elizabeth Joanne
Minmatar
New Angel Industries
United Federation Of Corps
Posted - 2009.04.17 19:34:00 - [61]
 

Quoting myself in post #3 in this thread:

Originally by: Elizabeth Joanne
Unless you have a CRT display or one of those new-fangled 120 Hz panels, you'll never see more than 60 fps anyway. That's the refresh rate of most panels, and there's no way to go beyond that.


Oh dear, I've created a monster.

*repents*


Zeba
Minmatar
Honourable East India Trading Company
Posted - 2009.04.17 19:35:00 - [62]
 

Edited by: Zeba on 17/04/2009 19:35:37
Originally by: Phantom Slave
There seems to be some misconception and a lot of personal bias in this thread.

I prefer v-sync on. It's a personal preference. Some people prefer v-sync off; that's their choice.

***SCREEN TEARING, and what it means to you!***

Screen tearing can ONLY be seen on a moving object; faster-moving objects mean more tearing. If you're sitting in station just looking at your ship, you won't see tearing because there isn't a lot of fast movement. Go outside the station and spin your camera around really fast. If your monitor is set at 60 Hz and your framerate is above that, you will notice that if you spin the screen fast enough the station will effectively be split up into parts for very VERY brief moments.

If you do not notice this with v-sync off then don't bother turning it on. If you DO notice it but it doesn't bother you then leave v-sync off. I notice it and it bugs me, so I use v-sync.

It's ALL personal preference. Some people just may not even notice it because they don't care. If you like v-sync then use it, if you don't then don't bother.

To those saying that v-sync lowers your system heat: you're not entirely correct. V-sync produces 2 frames for every frame the monitor shows, and if you have Triple Buffering on then you're producing 3 frames for every 1 that shows. It helps keep your framerate smooth because if something happens that slows down the GPU then it has extra frames to fall back on.


I'm sorry, but did you just bring logic and a tolerant, even-handed view of other people's opinions into an eve-o GD thread?


BURN HIM! Twisted Evil

Th0rG0d
Terminal Pharmaceuticals Inc.
Posted - 2009.04.17 19:38:00 - [63]
 

Originally by: Vaerah Vahrokha

I have 11/10 visus though.



Wow... So you can see at 11 feet something that the "average" person can see at 10 feet?

The last time I had my eyes checked, the doctor didn't have any measuring equipment to determine better than 20/10. But really, anything better than 20/20 vision isn't necessary (unless you are into long range spotting), so bragging about your eyesight on the 'net just lowers you to my level....

Rolling Eyes <--- yes, that is my 20/10 eyes rolling Laughing

Polly Prissypantz
Dingleberry Appreciation Society
Posted - 2009.04.18 01:26:00 - [64]
 

Originally by: Grez
A game pumping out 150 fps is running quicker, more responsive and generally doing everything it can/should do a lot quicker than a game doing that at 60 fps; hence, a game running at 150 fps will feel more responsive, even if you can only see 60 fps from the monitor.

For example, the game is running on the computer, not the monitor. The hardware is plugged into the computer, not the monitor. Just because the monitor is displaying it at 60/75fps (in their most common settings), does not mean that the game is running at that speed.

Limiting a game to output at a certain frame rate can also have detrimental effects on how the game is processed. Hence, a game can definitely, and almost always will, feel more responsive.

Those of us with computers that can handle the extra fps are all free to do it. If you lock your FPS to your refresh rate using v-sync, the computer is still going to be working as hard on other things; the only component that's not is the graphics card, and they are designed to be worked at 100% (most gaming ones, anyway).


Grez is correct. All you geniuses going on about not being able to see more than 60fps are missing the ****ing point. It's not about what you see, it's about how the game feels and responds. Eve is slow enough that you won't really notice the difference but in some faster-paced games there is a definite difference in how the game performs when limiting or not limiting the frame rate. We're not talking about how it looks.

This is an older, but still useful thread on the subject.

Liberal use of italics brought to you today by the Learn to ****ing Read Association.

Lothros Andastar
Gallente
Posted - 2009.04.18 01:31:00 - [65]
 

Minmatar scientists believe the 60 FPS limit is...

BECAUSE OF FALCON

Elizabeth Joanne
Minmatar
New Angel Industries
United Federation Of Corps
Posted - 2009.04.18 02:13:00 - [66]
 

So you all can see things that mere mortals can't. Impressive.

The next time I consider bringing up the 60 Hz rule I will think of you young pioneers who not only produce more frames per second on their 60 Hz monitors than anyone else, but also can see every one of them.

A little knowledge is a dangerous thing, as they say. Let's hope these people... wait, we have nothing to fear with these people.

Catherine Frasier
Posted - 2009.04.18 03:36:00 - [67]
 

Originally by: Polly Prissypantz
All you geniuses going on about not being able to see more than 60fps are missing the ****ing point. It's not about what you see, it's about how the game feels and responds.
Pomo nonsense. Correction. Italicized Pomo nonsense.

This is a mechanical system. It either displays more frames per second or it doesn't. It either processes more update or input cycles per second or it doesn't. There's nothing "feely" about it, period.

masternerdguy
Gallente
Meerkat Maner
Posted - 2009.04.18 03:37:00 - [68]
 

the human eye sees up to around 22fps, so anything above that is wasted anyway.

HankMurphy
Minmatar
Pelennor Swarm
Posted - 2009.04.18 04:26:00 - [69]
 

Originally by: Xianbei
wow the amount of armchair science in this thread is amazing

and how people can pound their chest and scream about something and be so wrong
is a testament to what the internet has become. the internet is at your very fingertips
and you can't even be bothered to research before you post.

i was going to offer some constructive info and links but really there is no point

you can lead an idiot to information but you cannot make him un-stupid


THIS x1000

in the spirit of the above, i would like to constructively suggest that the OP change out their cat5 cable with a cat5e or cat6 at the least, preferably one with gold ends to reduce latency. this should vastly improve your fps








Very Happy
trust me, i seem legit

Rathelm
Posted - 2009.04.18 06:01:00 - [70]
 

Originally by: Grez
Not being able to see more than 60 fps from your monitor is a myth, and anyone who says otherwise is just proving they know absolutely nothing.

Turn on advanced options in your graphics menu in EVE and change "Interval Default" to "Interval Immediate". It effectively turns vsync off, whereas now, it's set to driver default.


As much as you'd like to think that monitors work by magic, it's simply not the case. Your monitor's refresh rate is just that. Every refresh it redraws the screen. If your monitor's refresh rate is 60 Hz that means it redraws the screen 60 times a second. Therefore the absolute max that the video buffer will be drawn to the screen is 60 times a second. Without vertical sync enabled you effectively create a scenario where screen tearing will occur. This happens because the video buffer sends a new frame to the output device (your monitor) while it is mid-refresh, and the monitor starts drawing the new frame from wherever it happens to be in that refresh. Fast moving objects could therefore be torn.

No matter how often you force the video buffer to be updated it can only be drawn to the screen when the monitor is in a refresh cycle.
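A toy model of that last point (an illustration only, not how any actual driver or monitor is implemented): treat scanout as copying the front buffer row by row, and an unsynchronised present as swapping buffers partway through that copy.

    # Toy model of tearing: scanout copies the front buffer row by row; an
    # unsynchronised buffer swap partway through leaves the top rows from
    # the old frame and the bottom rows from the new one.
    ROWS = 8

    def render(frame_id):
        # Each "frame" is just ROWS rows labelled with the frame number.
        return [f"frame {frame_id}, row {r}" for r in range(ROWS)]

    def scanout(front, swap_at=None, next_front=None):
        displayed = []
        for r in range(ROWS):
            if swap_at is not None and r == swap_at:
                front = next_front          # present happened mid-refresh
            displayed.append(front[r])
        return displayed

    print("\n".join(scanout(render(1))))                                  # one clean frame
    print("\n".join(scanout(render(1), swap_at=5, next_front=render(2))))
    # rows 0-4 come from frame 1, rows 5-7 from frame 2: a visible tear line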

Rathelm
Posted - 2009.04.18 06:06:00 - [71]
 

Originally by: Phantom Slave
V-sync produces 2 frames for every frame the monitor shows, and if you have Triple Buffering on then you're producing 3 frames for every 1 that shows. It helps keep your framerate smooth because if something happens that slows down the GPU then it has extra frames to fall back on.


To add to this you have to have a back buffer. That's the way Direct3D works, and I'd imagine OpenGL too but I'm not familiar with that API. All drawing done is done to the back buffer. The back buffer should only be flushed to the output device when it is actually being refreshed. Disabling V-sync shouldn't even be an option that developers and video card makers give you. The only reason they do is to sell more expensive video cards.

Rathelm
Posted - 2009.04.18 06:20:00 - [72]
 

Originally by: Catherine Frasier
Originally by: Angelik'a
It's usually the feeling of smoothness rather than anything actually picked up by your eye. For example if you're playing a first person shooter at 25fps (why any more, your eye cant see it anyway amirite?)
No you're not "rite". Your eyes process a continuous stream of information and can easily detect the difference between 25 fps and 60 fps when there is motion in the image.

The "feeling of smoothness" comes from what your eyes (and brain) perceive (what else could it possibly be)?


That has more to do with the way DirectX works. When you tell DirectX to open up an ID3D9Device and an IDXGISwapChain, part of the initialization of the graphics engine is how many frames per second you're trying to achieve through the swap chain. If you're achieving less than that, it means the computer is chugging to try to keep up with what's going on. It also causes timing issues with your input. Another important thing is that there are multiple ways to do input, and if you tie input to your refresh rate it will drag it down even further.
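A minimal sketch of that last point (a toy loop, not EVE's or DirectX's actual code, and the costs are made up): if input is only sampled once per rendered frame, a slower frame rate directly stretches the input sampling interval too.

    import time

    def poll_input():
        return time.perf_counter()        # stand-in for reading the mouse/keyboard

    def render(frame_cost):
        time.sleep(frame_cost)            # stand-in for drawing one frame

    def coupled_loop(frame_cost, frames=10):
        samples = []
        for _ in range(frames):
            samples.append(poll_input())  # one input sample per rendered frame
            render(frame_cost)
        gaps = [b - a for a, b in zip(samples, samples[1:])]
        return sum(gaps) / len(gaps)

    print("input interval at ~60 fps:", round(coupled_loop(1 / 60) * 1000, 1), "ms")
    print("input interval at ~20 fps:", round(coupled_loop(1 / 20) * 1000, 1), "ms")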

Pottsey
Enheduanni Foundation
Posted - 2009.04.18 09:38:00 - [73]
 

masternerdguy said "the human eye sees up to around 22fps, so anything above that is wasted anyway."
I did a number of blind tests and every single person could tell the difference between 30, 60, 100 and 120fps. Couldn't go higher than 120, as that was the max my screen could display. It's easy to test: just set half the screen to 30fps and half to 60fps and see if different people can spot a difference.

Anyone who cannot see the difference between 30 and 60fps, or cannot see beyond 60fps, has poor eyesight or a screen that cannot display more than 60fps.
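For what it's worth, a rough sketch of that kind of split-screen comparison, using pygame (an assumption; the original test setup isn't described). Both halves animate a moving block under a ~60 Hz cap, but the right half is only redrawn on every second tick, so it updates at half the rate of the left half.

    import pygame

    pygame.init()
    screen = pygame.display.set_mode((800, 400))
    clock = pygame.time.Clock()

    x, frame, running = 0, 0, True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False

        x = (x + 4) % 380
        screen.fill((0, 0, 0), pygame.Rect(0, 0, 400, 400))            # left: every tick
        pygame.draw.rect(screen, (255, 255, 255), (x, 190, 20, 20))
        if frame % 2 == 0:                                             # right: every 2nd tick
            screen.fill((0, 0, 0), pygame.Rect(400, 0, 400, 400))
            pygame.draw.rect(screen, (255, 255, 255), (400 + x, 190, 20, 20))

        pygame.display.flip()
        clock.tick(60)       # cap the loop at roughly the display refresh rate
        frame += 1

    pygame.quit()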

Emperor D'Hoffryn
EXTERMINATUS.
Nulli Secunda
Posted - 2009.04.18 12:05:00 - [74]
 

Originally by: Catherine Frasier
Originally by: Polly Prissypantz
All you geniuses going on about not being able to see more than 60fps are missing the ****ing point. It's not about what you see, it's about how the game feels and responds.
Pomo nonsense. Correction. Italicized Pomo nonsense.

This is a mechanical system. It either displays more frames per second or it doesn't. It either processes more update or input cycles per second or it doesn't. There's nothing "feely" about it, period.


Actually, his argument is poorly worded, but what he's on about is the feedback delay. When you v-sync lock to 60fps, you are guaranteeing that when you change your input to the game, you will see the results of that change in input no sooner than about 1/60 of a second later, on average. (i.e., your FPS character starts to turn)

If you are running a significantly higher frame rate, it's possible that you will get a partial frame of this reaction to input sooner. This is similar to the problem with some cheapo LCD screens with monitor lag, only much reduced.

Does it matter for EVE? No. Can he really tell? I would argue it's most likely in his head at the time spans we are talking about, but monitor lag is quite real and you will notice it once you get a bad LCD screen. It's quite possible that the update delay with monitor lag (ALL LCDs have it, just some are really bad) amplifies the effect somehow.

Since none of this matters for EVE, and EVE has some super simple screens that go wonky with no limitations (login screen, inside stations), use the EVE graphics settings to turn on v-sync, and leave it off in your driver settings for other games. If you ever have to go afk suddenly, and you are docked when you do so, your video card will thank you for it.

Karentaki
Gallente
Oberon Incorporated
Morsus Mihi
Posted - 2009.04.18 12:10:00 - [75]
 

Originally by: Gabriel Loki
Originally by: Lonzo Kincaid
what's the frame rate for human eyes?


They don't have one.


This is true, but for something to appear as continuous motion it needs to be at least about 15FPS. Higher framerates simply make the image smoother, but anything beyond 60 isn't really noticeable.

Draeca
Tharri and Co.
Posted - 2009.04.18 12:14:00 - [76]
 

Originally by: Red Wid0w
Yeah, listen to what people are telling you. YOU NEVER HAD 150 FPS. YOUR MONITOR CAN ONLY DISPLAY 60 (more than likely). All you are doing with your 150fps is wasting processing power, thus reducing the responsiveness of your background apps. Turn VSYNC on and lock your fps to 60 and there WILL BE NO DIFFERENCE TO EVE. However, everything else on your pc will run faster.

Oh, and those 120Hz LCDs are rubbish; they basically interpolate between frames.


There is a difference: I had this strange "jumping" (kinda like lag, but a lot smoother) when spinning the camera with v-sync enabled. Turning v-sync off fixed it and now EVE's back to normal.

Elizabeth Joanne
Minmatar
New Angel Industries
United Federation Of Corps
Posted - 2009.04.18 15:07:00 - [77]
 

Anyone foaming at the mouth about frame rates and their effect on reaction time etc. is conveniently ignoring the real world.

As has been established, there is no way to have a 60 Hz monitor display more than 60 frames per second; but this is just one part of the equation when it comes to reacting to what is being displayed.

On the surface, it looks like a 60 fps display allows you to see what is happening with a delay of at most ~17 milliseconds (1/60th of a second), and that this delay could be reduced by increasing the frame rate.

Alas, not so. TFT displays also have this phenomenon called input lag that isn't often discussed because it can be outright horrendous. The Dell 2707WFP for example has 46 milliseconds of input lag. This means you are effectively at least 2 frames in the past at all times.

If you are worried about frame rates and their effect on your reaction times, choose a monitor with a very low input lag first. TFT Central has measured the input lag on a bunch of models. Manufacturers typically don't advertise this figure.
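To put the numbers above together (a back-of-the-envelope calculation; the 46 ms figure is the Dell 2707WFP measurement cited above):

    # Rough latency budget at 60 Hz.
    refresh_hz = 60
    frame_ms = 1000 / refresh_hz          # ~16.7 ms per refresh
    panel_input_lag_ms = 46               # display processing delay (cited above)

    frames_behind = panel_input_lag_ms / frame_ms
    print(f"one refresh: {frame_ms:.1f} ms")
    print(f"46 ms of input lag = {frames_behind:.1f} refreshes behind")   # ~2.8 frames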

Pan Crastus
Anti-Metagaming League
Posted - 2009.04.19 00:17:00 - [78]
 

Originally by: Phantom Slave

To those saying that v-sync lowers your system heat: you're not entirely correct. V-sync produces 2 frames for every frame the monitor shows, and if you have Triple Buffering on then you're producing 3 frames for every 1 that shows. It helps keep your framerate smooth because if something happens that slows down the GPU then it has extra frames to fall back on.


Please stop posting such nonsense. Or at least read up on this stuff before you try to post such gibberish. V-Sync does not "produce 2 frames for every 1 that shows", nor does triple-buffering produce 3.

V-Sync is commonly used with a double buffer / back buffer and what happens is that while one image (buffer) is displayed, the next one is drawn in the other buffer. When that one is finished, the GPU pauses until the next V-sync and the buffers are switched (the one just drawn is displayed, the other is used for drawing).

For triple-buffering, instead of pausing when the hidden buffer is drawn, the 3rd buffer is drawn.

All these methods were more interesting when the frame rate was (sometimes / usually) lower than the v-sync frequency. When it isn't, and a frame can always be drawn within one v-sync interval, using one hidden buffer is enough and the graphics card will pause anyway, because you only need 60 (or whatever your monitor wants) frames per second.
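A toy timeline of the double-buffered scheme described above (an illustration only, not any driver's real scheduling), for the case where the GPU can always finish a frame within one refresh:

    REFRESH_MS = 1000 / 60

    def simulate(render_ms, vsyncs=5):
        # One buffer is shown while the next frame is drawn into the other;
        # if drawing finishes early, the GPU idles until the next v-sync.
        front, back = "A", "B"
        for tick in range(vsyncs):
            idle = max(0.0, REFRESH_MS - render_ms)
            print(f"v-sync {tick}: show {front}, draw {back} "
                  f"({render_ms:.1f} ms drawing, {idle:.1f} ms idle)")
            front, back = back, front          # buffers are swapped at the v-sync

    simulate(render_ms=6.0)   # fast GPU: plenty of idle time each refresh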

Conclusion: only idiots switch off v-sync unless the program is bugged to hell. All it does is stress your gfx card and CPU more and sometimes, as a consequence, various parts of your system (like sound/mouse) will suffer.

Astigmatic
Posted - 2009.04.19 00:29:00 - [79]
 

Originally by: johny B5
before the patch, i had 150 fps and higher with the highest settings, now i only have about 60 fps. why? please remove these stupid restrictions. this is not doom 3.

i recently bought a new graphics card, and cannot make use of it. i like my old fps back, whether i am able to see them or not. when i am in a battle situation with fewer than 60 frames, this is not acceptable.

ccp, please remove this useless restriction fast!


You now know how to remove it. Switch to Interval Immediate. Now burn out your gfx card and whine elsewhere. All the armchair techs, please follow suit or set up a business. Let's get on with actually discussing Eve mechanics rather than technical ineptitude affecting some of the playerbase.

Clementina
The Scope
Posted - 2009.04.19 00:48:00 - [80]
 

Thanks to this thread, I changed my game from Interval Immediate to Interval One. Eve uses less CPU time according to Windows Task Manager, and I am not noticing a difference in appearance.

Jana Clant
New Dawn Corp
New Eden Research.
Posted - 2009.04.19 01:21:00 - [81]
 

Originally by: Polly Prissypantz
Grez is correct. All you geniuses going on about not being able to see more than 60fps are missing the ****ing point. It's not about what you see, it's about how the game feels and responds.


Unless you are somehow telepathically connected to your computer, what you see through your monitor is how the game feels and responds. The fact that you have a bunch of frames being rendered by the graphics card and not being used at all is irrelevant, so assuming V-Sync is off and your refresh rate is 60 Hz, it won't make a shred of difference if your graphics card is rendering 60 or 150 frames per second. Any "feeling" you might have about it is purely psychological due to having the frame rate displayed.

Originally by: Polly Prissypantz
Eve is slow enough that you won't really notice the difference but in some faster-paced games there is a definite difference in how the game performs when limiting or not limiting the frame rate. We're not talking about how it looks.


I completely agree that V-Sync has a detrimental effect on your frame rate (even if that only applies when your card's frame rate is lower than the refresh rate). However, that difference you claim to "feel" about how the game "performs" is utter crap. What you're seeing is the screen showing you the same frame twice because your graphics card didn't have a new one ready yet, but that's not the game feeling slower, it's the game looking slower. So yes, you are talking about how it looks, and nothing else.

Originally by: Polly Prissypantz
This is an older, but still useful thread on the subject.


I suggest you read it again because you don't seem to understand what's said there.

Originally by: Polly Prissypantz
Liberal use of italics brought to you today by the Learn to ****ing Read Association.


Right back at ya, with sporadic use of red, bold and larger font for dramatic effect.

Sjobba
Posted - 2009.04.19 03:45:00 - [82]
 

Edited by: Sjobba on 19/04/2009 03:48:33
Edited by: Sjobba on 19/04/2009 03:46:20
Realizing this has been explained a number of times already, I will put it out there once more...
(I actually did some research on this last time I decided to teach myself game programming Razz)

First:
The amount of FPS you can see is limited by your monitor.
If your screen can only show 60fps, rendering at a higher rate will NOT make the game smoother... the extra frames will either just be dropped or fragmented into other frames (visible as screen tearing).

Second:
Whether or not your game is rendering at 60, 150, 250, or a 1000 frames per second... it will NOT change how smooth the game feels. Window positions and such are all calculated on different threads, separately, away from the graphics.
(Assuming the programmers designing it used multi-threading, which is a fair assumption, really)

Meaning, having a higher fps will NOT make moving windows and such *feel* smoother... The positions will simply be displayed wherever they are whenever the GPU gets around to rendering the next frame... The actual position of the window is NOT calculated when the frame is rendered. (As was the case with single-threaded applications.)

If anything, higher FPS (above the monitor max) will decrease the smoothness of the game, as it puts more strain on the hardware to render the graphics, leaving less resources available for the rest of the threads.

Note however, that, obviously, if your computer can only handle 30fps, odds are that the game will be a lot less smooth than on a computer that can handle 150fps. (But not because of the FPS alone, as explained above.)
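A minimal sketch of that decoupling (an illustration only, not EVE's actual architecture): the simulation state advances on its own fixed-rate thread, and the render loop just reads whatever the latest state is, so rendering faster does not advance the state any faster.

    import threading, time

    state = {"x": 0.0}

    def update_loop(stop):
        while not stop.is_set():
            state["x"] += 1.0            # fixed-rate simulation step
            time.sleep(1 / 20)           # 20 updates per second

    stop = threading.Event()
    threading.Thread(target=update_loop, args=(stop,), daemon=True).start()

    frames = 0
    t0 = time.perf_counter()
    while time.perf_counter() - t0 < 1.0:
        _ = state["x"]                   # "render" the latest state
        frames += 1                      # no cap: render as fast as possible

    stop.set()
    print(f"{frames} frames rendered, but only ~20 state updates happened")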

Third:
The fact that your computer can render at 150fps, 250fps, etc... is not impressive.
A high FPS does not mean your hardware is uber. If anything, it just means it's configured inefficiently, and that you will have to replace it sooner than you would otherwise have to.

Edit: spelling and minor nitpicks Embarassed

MechaViridis
Amarr
The Program
Vanguard.
Posted - 2009.04.19 06:34:00 - [83]
 

Edited by: MechaViridis on 19/04/2009 06:37:59
If any of you guys have played competitive Counter-Strike, Quake, etc., you will know that fps higher than 60 DOES matter. It has something to do with netcode/extrapolation of movement, so for EVE this isn't as important. HOWEVER, with V-Sync on, if your FPS goes below 60 it will drop further than it would if you didn't have it on. Go into a big fleet battle with v-sync on. You will have lower fps because some of the frames will still be out of sync, causing more latency. Now turn off v-sync and go into the same situation. It will most likely be higher.
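One concrete way the first claim can play out with plain double-buffered v-sync and no triple buffering (a simplified model, not a measurement of EVE): a frame that takes even slightly longer than one refresh interval has to wait for the next v-sync, so its cost is rounded up to a whole number of refreshes.

    import math

    REFRESH_MS = 1000 / 60

    def fps(render_ms, vsync):
        # With plain double-buffered v-sync, a finished frame still waits for
        # the next refresh boundary before it can be shown.
        if vsync:
            render_ms = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
        return 1000 / render_ms

    for cost in (15.0, 18.0, 25.0):   # ms of work per frame (made-up figures)
        print(f"{cost:4.1f} ms/frame -> {fps(cost, False):5.1f} fps uncapped, "
              f"{fps(cost, True):5.1f} fps with v-sync")
    # 18 ms of work gives ~55 fps uncapped but only 30 fps under v-sync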

ALSO, for everyone saying "Oh your GFX card will die sooner if you run a game at 350fps rather than 60...", this is complete nonsense. Silicon does not just 'die'. The most common reason cards die is that a fan burns out and causes the card to overheat. And there is really no heat difference between 60fps and 350. Download RivaTuner and open the temperature logger, then test it for yourself. You won't see a difference.

So in conclusion: v-sync on or off, IT'S YOUR PREFERENCE... THERE ARE PROS AND CONS TO EACH

Alexander Nergal
Three Pony
Posted - 2009.04.19 09:31:00 - [84]
 

Seems like every time an issue, legitimate or not, is raised on these forums, all that follows are angry, angry posts by mad people. Everything is a "whine". It's goddamn BS.

sidenote, I can feel the difference between 200 and 300 fps so I guess I'm just a god Rolling Eyes

Feilamya
Pain Elemental
Posted - 2009.04.19 09:37:00 - [85]
 

Originally by: Sjobba
Second:
Whether or not your game is rendering at 60, 150, 250, or a 1000 frames per second... it will NOT change how smooth the game feels. Window positions and such are all calculated on different threads, separately, away from the graphics.
(Assuming the programmers designing it used multi-threading, which is a fair assumption, really)


Actually not.
Multi-threaded programming is a *****, especially when your application performs a lot of inherently sequential stuff. And in games, there is a lot of inherent sequentiality.

For example, the game client cannot look into the future and render the next 10 or so frames in parallel with the rest of the game mechanics just because it has free GPU resources. What is shown on the next 10 frames depends on user input and what the server sends to the client.

There used to be a good article about this, but I can't find it any more. The following seems to describe the same problem:
http://www.gamasutra.com/features/20051117/gabb_01.shtml

EVE is an old game. For most old games, it's actually a very safe bet that they work entirely single-threaded, except maybe for one or two dedicated threads that handle sound rendering or networking (and I doubt EVE uses any amount of multi-threading for networking). Some may actually use a dedicated graphics rendering thread, which would mean your argument holds.


So what impact does FPS have on the performance of single-threaded games?
It depends!
If V-sync is on, the CPU waits up to 1/60 of a second for every frame displayed. This time cannot be used for doing other stuff, like networking, handling user input, etc.
(Well, it can. But this requires careful programming, or otherwise the single-threaded game loop might miss the next V-sync)
If V-sync is off, the CPU does not wait for V-sync, but it is busy rendering additional, useless frames that will only result in garbage on the screen (tearing). Frames are not rendered entirely on the GPU. The CPU still has to do a lot of work on its own. So this time is wasted and cannot be used by the CPU to do useful stuff.
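A toy accounting of those two cases for one second of a single-threaded loop (made-up numbers, just to show where the CPU time goes):

    REFRESH_MS = 1000 / 60
    RENDER_COST_MS = 4.0        # CPU time to prepare one frame (made-up figure)

    def budget(vsync):
        if vsync:
            frames = 60
            rendering = frames * RENDER_COST_MS
            waiting = 1000 - rendering            # blocked on the refresh, unless the
            return frames, rendering, waiting     # loop is written to use this time
        frames = int(1000 / RENDER_COST_MS)       # render back to back all second
        return frames, frames * RENDER_COST_MS, 0.0

    for vsync in (True, False):
        f, r, w = budget(vsync)
        print(f"v-sync {'on ' if vsync else 'off'}: {f:3d} frames, "
              f"{r:6.1f} ms rendering, {w:6.1f} ms waiting on v-sync")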

So the bottom line is that the game can run more or less smoothly when V-sync is on. It depends how well it is implemented. Smart game developers will probably optimise for V-sync enabled, because it simply looks better.

Smart Counter-Strike developers, on the other hand, will optimise for V-sync off, because they know their customers think high FPS = more E-peen.

Borne Seller
Posted - 2009.04.19 11:41:00 - [86]
 

Originally by: johny B5
Originally by: NeoTheo


you have your interval set; that locks the FPS to the same as your monitor's refresh rate.

change it back to immediate.




thanks, it worked! I didn't know these settings before and didn't change them. that's why i was confused.
Here's how to do it:
1. Go to Graphics & Displays
2. Check the box "Advanced settings"
3. Set "Present Interval" to "Interval Immediate"



Thank you. I was pulling my hair out trying to understand why. Cheers m8

Sjobba
Posted - 2009.04.19 20:58:00 - [87]
 

Originally by: Feilamya
EVE is an old game. For most old games, it's actually a very safe bet that they work entirely single-threaded, except maybe for one or two dedicated threads that handle sound rendering or networking (and I doubt EVE uses any amount of multi-threading for networking). Some may actually use a dedicated graphics rendering thread, which would mean your argument holds.

True, but keep in mind that the EVE graphics engine was pretty much remade from the ground up with Trinity, so the premium client we use today is not really that old.

Although, the fact that fetching data from the market freezes the entire game would suggest it might be less threaded than I was picturing Neutral

Originally by: Alexander Nergal
sidenote, I can feel the difference between 200 and 300 fps so I guess I'm just a god Rolling Eyes

You expect there to be a difference between the two, so you see it whether or not it is there.
There is a psychological term for this... which I can't quite remember at the moment.

I mean... I could swear my computer booted up twice as fast after I upgraded my old SATA drive to SATA II... until I realized I had messed up the cables and was still booting off my old SATA drive. Laughing
(It did feel a bit odd, getting a new HDD partitioned exactly like my old one Smile)

Agent Known
Posted - 2009.04.19 21:08:00 - [88]
 

It's actually a good thing they set it to Interval One by default...people were complaining that EVE was burning out their fancy-pants graphics cards.

Yes, increased FPS gives you a better buffer, but at the cost of increased heat and load on the graphics card (CPU is NOT affected by this as much).

Also, if you have an old CRT @ 60Hz, you'll probably see it flickering a lot, which also strains your eyes (if a light's on). That says something about what the eye actually "sees": the 60 Hz flicker is visible. Increasing the Hz reduces the flicker since the light and your monitor will refresh at different rates. This isn't true with LCD monitors because of the way they function (a backlight with crystals that block, not generate, light to create an image).

/me is done

Astigmatic
Posted - 2009.04.19 21:14:00 - [89]
 

Originally by: Sjobba
You expect there to be a difference between the two, so you see it whether or not it is there.
There is a psychological term for this... which I can't quite remember at the moment.



Let me help you. The person you are referring to is Alexander Nergal, claiming to feel the difference between 200 and 300fps, which is only possible on really top-end machines with code that specifically supports it, and is so recent there is no historical data to support its existence. He's nuts.

gamertrav
Trauma Ward
Posted - 2009.05.03 10:36:00 - [90]
 

Edited by: gamertrav on 03/05/2009 10:36:48
Originally by: Elizabeth Joanne
Alas, not so. TFT displays also have this phenomenon called input lag that isn't often discussed because it can be outright horrendous. The Dell 2707WFP for example has 46 milliseconds of input lag. This means you are effectively at least 2 frames in the past at all times.


Yep, and it should be noted that 46ms is more than the network lag you will generally have in an online FPS, so you are basically doubling (or even tripling) the amount of lag you perceive.

Originally by: Elizabeth Joanne
If you are worried about frame rates and its effect on your reaction times, choose a monitor with a very low input lag first. TFT Central has measured the input lag on a bunch of models. Manufacturers typically don't advertise this figure.



The thing is, I chose an LCD with 2ms lag, and it's still very noticeable in games (even in EVE) when I enable v-sync. Maybe the manufacturer just lied about the lag, but I have 2 other LCDs in my house that are even worse.

It's less of a problem in EVE of course, because EVE is a lot less dependent on split-second reactions, but it still annoys me when moving the camera around. The real problem for me in EVE though is that when running 2 clients with v-sync on, they sometimes drop to very low FPS (like 20fps...) for some reason, but with v-sync disabled this never happens.

