All Channels > EVE General Discussion > 60 fps restriction - why?
 
This thread is older than 90 days and has been locked due to inactivity.


 
Pages: 1 2 3 [4] 5

Author Topic

Astria Tiphareth
Caldari
24th Imperial Crusade
Posted - 2009.05.03 11:08:00 - [91]
 

Edited by: Astria Tiphareth on 03/05/2009 11:08:40
This is bizarre. You do realise you're arguing about a figure that has no relevance to gameplay in EVE whatsoever?

Fact 1. Server operational cycle in EVE is one second, that's 1000 milliseconds. No matter how fast you see and respond to visual input, I guarantee you that the network latency & server cycle time combined will be slower than your reaction time.

Fact 2. A monitor cannot display frames faster than its own refresh rate.

V-sync on or off makes no practical difference whatsoever to your visual response time. The only distinction is whether you get screen tearing or not. That 150 fps is mostly going to waste as heat in your graphics card, because you are most definitely not seeing the benefit.

The idea that you're not making use of a new graphics card because it doesn't go above 60 fps is so mindbogglingly uninformed, I don't know where to begin. Have you considered that it can do more in that 60th of a second than an inferior graphics card?

If you're in a battle with less than 60 fps, you're an idiot for thinking that v-sync is even remotely related. If you have less than 60 fps, you have less than 60 fps. The card is working as fast as it can. That's what bloody v-sync means! Sync to refresh rate. If it's not managing that, what the hell do you think it's doing, twiddling its thumbs?

Astria Tiphareth
Caldari
24th Imperial Crusade
Posted - 2009.05.03 11:20:00 - [92]
 

Edited by: Astria Tiphareth on 03/05/2009 11:36:05
Edited by: Astria Tiphareth on 03/05/2009 11:21:10
Originally by: MechaViridis
If any of you guys have played competitive counter-strike, quake, etc, you will know that fps higher than 60 DOES matter. It has something to do with netcode/extrapolation of movement, so for eve this isn't as important.

No no no. It's what Feilamya outlined. You're making a logical link where none exists.

The issue with going above 60 fps in Unreal Tournament etc. has nothing to do with FPS & v-sync and everything to do with game design. Those older first-person shooters are, by and large, single-threaded: network code, physics calculations, graphics code, everything sits in one massive loop.

If you configure for v-sync, you require that the game code delays the next graphics frame being pushed to display until the next refresh comes around. Badly or lazily written loop code will mean that everything else waits for that as well.

Whilst it is feasible to write code that delivers a timely 60 FPS whilst letting physics run as often as it likes, it's not trivial. That's a large part of why multi-threading took over: despite its inherent complexity, it's easier and scales better than coercing a single loop into doing everything.
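To picture that single massive loop, here's a tiny compilable C++ sketch (the subsystem names are made up for illustration, not any real engine's API):

#include <cstdio>

// Stub subsystems -- hypothetical names for illustration only.
void ReadInput()         { /* poll mouse/keyboard */ }
void PumpNetwork()       { /* send and receive packets */ }
void StepPhysics(double) { /* movement, hit detection */ }
void RenderFrame()       { /* draw into the back buffer */ }
void Present()           { /* with v-sync on, the real call blocks here until the next refresh */ }

int main() {
    const double dt = 1.0 / 60.0;
    // Everything lives in one loop: if Present() waits for the vblank,
    // input, network and physics all wait with it. That coupling, not the
    // frame rate itself, is where the old "never cap your FPS" advice came from.
    for (int frame = 0; frame < 3; ++frame) {
        ReadInput();
        PumpNetwork();
        StepPhysics(dt);
        RenderFrame();
        Present();
        std::printf("frame %d done\n", frame);
    }
}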

The monitor physically cannot display anything faster than its refresh rate. At best you'll get a screen tear and a frame a 60th of a second sooner (assuming 60Hz refresh).

If your personal response time is that good, then more power to you. However, I'd point out that the difference we're talking about is at most 16 ms, which is exactly why the other sources of lag are so critical to take into account. In a LAN Counter-Strike game, where a super-low ping of 4 ms or less is achievable, your response time and monitor refresh may just possibly have an impact. In EVE? Not a chance. Not ever. It's physically impossible. The network & server lag is an order of magnitude larger.
Originally by: gamertrav
The thing is, I chose a LCD with 2ms lag, and it's still very noticeable in games (even in eve) when I enable vsync. Maybe the manufacturer just lied about the lag, but I have 2 other LCDs in my house that are even worse.

They often don't tell you everything. It's much better to try a monitor before buying it if you can. My VX922, which, as listed on the site mentioned earlier, really does have only a 2 ms lag, has been superb. Sadly, I could go out, find a screen that also claims 2 ms, and discover it doesn't actually deliver it. It's pot luck, really irritating, and, along with other parts of monitor descriptions, something I wish Trading Standards and the like would tighten up on.

However, thankfully, for EVE it has little impact. A hiccup by your ISP or heavy upload/download traffic is more likely to have an impact.

TamiyaCowboy
Caldari
KRAKEN FLEET
Posted - 2009.05.03 11:50:00 - [93]
 

Edited by: TamiyaCowboy on 03/05/2009 11:51:41
When vertical sync is disabled, a video card is free to render frames as fast as it can, But the display of those rendered frames is still limited to the refresh rate of the monitor. For example, a card may render a game at 100 FPS on a monitor running 75 Hz refresh, but no more than 75 FPS can actually be displayed on screen.

Certain elements of a game may be more GPU-intensive than others. While a game may achieve a fairly consistent 60 frame/s, the frame rate may drop below that during intensive scenes. By achieving frame rates in excess of what is displayable, it makes it less likely that frame rates will drop below what is displayable during heavy CPU/GPU load.

The above is the explanation you want. Also remember that the GPU can be instructed to slightly blur frames as they change to make the transition smoother, and the human eye picks up these blurs more easily, or so the wiki says.

Mr Malaka
Posted - 2009.05.03 13:44:00 - [94]
 

Confirming I'm on page 4 of an idiotic whine that was answered in post #2.

What a ****ing train wreck

Ghoest
Posted - 2009.05.03 16:43:00 - [95]
 

I'm guessing it has to do with all the high-end cards that were burning up when you used max settings.

Weer Treyt
Posted - 2009.05.03 16:53:00 - [96]
 

Please do the following test:

1.1 Switch your graphics settings to interval immediate.
1.2 In a station take one item on your cursor and drag it left and right over the whole screen, till you get a feeling for the icon lagging behind your movements.
1.3 You realize that the icon is lagging behind only by a tiny amount.

2.1 Switch your graphics settings to interval one (or more).
2.2 Do the same as in 1.2.
2.3 Realize that the icon is lagging behind noticeably more than with interval immediate settings.


Weer Treyt

Agent Known
Posted - 2009.05.03 17:08:00 - [97]
 

Set it to Interval Immediate and leave it be. Burn up your graphics card for all I care. This threadnaught needs to die.

Roy Batty68
Caldari
Immortal Dead
Posted - 2009.05.03 18:26:00 - [98]
 

Originally by: johny B5
before the patch, i had 150 fps and higher with the highest settings

Shocked
Quit hogging all the FPS! Share with the rest of us. Quit being so greedy. Evil or Very Mad



Or you don't get any cake... Wink

Onus Mian
Amarr
Kingfisher Industries
Posted - 2009.05.03 19:27:00 - [99]
 

Edited by: Onus Mian on 03/05/2009 19:31:07
Edited by: Onus Mian on 03/05/2009 19:26:56
Can your eyes even see things changing at faster than 60 fps? It's been a while since I did this at school, but from what I remember the rate at which the pigments in your eye can be replenished limits how many frames you can see in a second, regardless of how high the fps is.

EDIT

Never mind, Wikipedia says it's hard to quantify a human fps.

Armoured C
Gallente
Noir.
Noir. Mercenary Group
Posted - 2009.05.03 20:11:00 - [100]
 

i am fine with 45 FPS across 2 screens and can watch and be in a large fleet battle, 60 FPS is adequate enough i think

Weight What
Gallente
Posted - 2009.05.03 20:46:00 - [101]
 

Originally by: Armoured C
i am fine with 45 FPS across 2 screens and can watch and be in a large fleet battle, 60 FPS is adequate enough i think



Maybe it's enough for you and your primitive eyes; however, those of us with cybernetic ocular implants can process scenes at up to 200 FPS.

Armoured C
Gallente
Noir.
Noir. Mercenary Group
Posted - 2009.05.03 20:49:00 - [102]
 

Originally by: Weight What
Originally by: Armoured C
i am fine with 45 FPS across 2 screens and can watch and be in a large fleet battle, 60 FPS is adequate enough i think



Maybe it's enough for you and your primitive eyes; however, those of us with cybernetic ocular implants can process scenes at up to 200 FPS.


ahh i have cybernetic forumfingertypo version snake plants :)

Weight What
Gallente
Posted - 2009.05.03 20:49:00 - [103]
 

Originally by: Armoured C
ahh i have cybernetic forumfingertypo version snake plants :)




We shall see :))

Armoured C
Gallente
Noir.
Noir. Mercenary Group
Posted - 2009.05.03 20:52:00 - [104]
 

Originally by: Weight What
Originally by: Armoured C
ahh i have cybernetic forumfingertypo version snake plants :)




We shall see :))


you cant beat me ... you cant even come close ... plus i have no work tomorrow and have stayed up without sleep to ensure my forum victory =)

Henry Loenwind
Gallente
Area 42
Posted - 2009.05.03 22:11:00 - [105]
 

Oh come on people, you are mixing up a LOT of things in this discussion.

(1) What can the eye see

(1a) Flicker

Definition: "Flicker" is when the screen goes dark after a frame has been shown.

The eye can see flickering up to very high rates. At about 100-120 Hz the flickering is no longer disturbing, but may still be noticed. CRT monitors flicker at their refresh rate; LCDs do NOT flicker at all. CRT TVs do not flicker either, but they pulsate instead (the image does not go black after each frame, it just gets darker). Movies flicker, but not at the frame rate (24 fps): they run at a doubled (48 fps) or tripled (72 fps) flash rate---the same frame is projected multiple times to avoid visible flicker.

(1b) Movement

The eye recognizes movement at about 15 fps. Below that it's single images; above that it's movement. At higher frame rates the movement gets smoother. How high the frame rate must be for the movement to look absolutely smooth depends on the kind of images that make up the frames. If the images have motion blur (everything recorded with a camera has it), smooth movement starts at about 22 fps and becomes perfect somewhere between 40 and 50 fps. If they don't have motion blur (everything created by a computer, unless motion blur is added), much higher frame rates are needed. Some people may be able to detect the difference at 200 fps, while for others 120 fps might be enough.

Note: Adding motion blur is an expensive process that requires extra (partial) frames to be rendered. We are talking about at least 4-10 frames extra per frame rendered. Guess why those SFX studios need those huge rendering farms for rendering those "simple" 1920*1050px 24fps CGI scenes...

(2) What the monitor can display

(2a) Linked display

CRTs and CRT TVs are "linked", meaning that the image shown is directly tied to the input signal. There is only a small processing delay (1 or 2 scan lines). So if you feed a 59 Hz signal to a CRT, it will display it at 59 Hz. If you give it a 100 Hz signal, it will display it at 100 Hz. If you give it a 200 Hz signal, it will display either nothing or die. Defective CRTs after playing with the display settings were somewhat common 15 years ago...

(2b) non-linked display

LCDs and modern TVs (with image enhancers, e.g. 100 Hz technology) are not linked. They read the input signal into an internal buffer and then display the image (or more images, or fewer images, or a changed image) whenever they "want" to. So a 120 Hz TV will buffer 2 images, then compute a third, then display them. This means there is a time delay between the input signal and the image on the screen. Some LCDs can have a very large delay, and LCD TVs usually have a huge delay too. So if you have a home cinema installation and you notice that the sound from your surround speakers arrives early, blame the TV for being late with the picture! (Or disable its image enhancers. Some TVs have a "PC mode" that does that.) If your mouse seems to lag behind your movements even on the desktop, try another display.

(2c) black-white change

Early LCDs had the problem that you could see ghost images. The reason for that is that each pixel needs some time to change its colour. THAT is the 2ms/5ms value we nowadays see in the technical data. LCD pixels have become faster lately, but the main improvement comes from image-enhancing technology. Basically, the signal sent to the pixel has a different brightness than what should be displayed, but because of the slowness of the pixel it ends up being displayed as intended. Again, there's an image enhancer at work, meaning we get a delay as in 2b.

(cont'd)

Henry Loenwind
Gallente
Area 42
Posted - 2009.05.03 22:12:00 - [106]
 

(2d) Hz

Non-linked displays (usually) operate at a fixed rate. So an LCD will display 60 frames per second regardless of the input signal. It may announce to the PC that it can process data at 59, 60 and 75 Hz, but it will display at 60. To my knowledge there is no current consumer LCD that can do 75 Hz, though some displays still accept 75 Hz as input. And with that we come to:

(3) What's on the cable

That's the cable between PC and monitor. In the early days it was a tricky thing to configure the PC so it would send a signal exactly the way the monitor could process it. However, that's the past. Today a monitor talks back to the PC and tells it exactly what it likes, and with an LCD this will be a 60 Hz signal in 99% of all cases.

(4) What's on the PC

So we know that every 1/60th of a second one frame must be put on the cable, and that it will take about 1/65th of a second to transfer. So, how does the PC cope with that? There are a couple of different scenarios:

(4a) vsync off, no double buffering

This is the classical way. The GPU renders into a buffer as fast as it can. At the same time, a different part of the GPU reads from that same buffer, pixel by pixel, as it sends the data down the line. This is the mode that will (will, not may!) produce "tear lines" by mixing data from different rendered frames into the same frame that is sent to the monitor. But there is also a good side to this: at the moment a pixel is sent to the monitor, we know that its maximum age is 1/render-fps seconds (read: we get the newest data for that pixel that the GPU can produce).
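As a toy illustration of where the tear line comes from (my own sketch, not how real scanout hardware is implemented): if a new frame lands in the buffer while the "monitor" is partway through reading it, the top of the output shows the old frame and the bottom shows the new one.

#include <cstdio>

int main() {
    const int rows = 8;
    int framebuffer[rows];
    int screen[rows];

    // Renderer puts frame 1 into the buffer.
    for (int r = 0; r < rows; ++r) framebuffer[r] = 1;

    // "Monitor" reads the buffer top to bottom; halfway through, the renderer
    // (running unsynchronised, v-sync off) finishes frame 2 into the same buffer.
    for (int r = 0; r < rows; ++r) {
        if (r == rows / 2)
            for (int w = 0; w < rows; ++w) framebuffer[w] = 2;  // new frame lands mid-scan
        screen[r] = framebuffer[r];
    }

    // Top half shows frame 1, bottom half shows frame 2: a tear line in the middle.
    for (int r = 0; r < rows; ++r) std::printf("row %d shows frame %d\n", r, screen[r]);
}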

(4b) vsync on, no double buffering

The GPU renders into a buffer, and when the buffer is filled, it waits. After the buffer has been sent to the monitor, the GPU starts to render the next frame. The effect is that this kind of setup fails completely---the gap between frames, when no data is being sent to the monitor, is so short that the GPU cannot render a frame in it.

(4c) vsync on, double buffering

Now we have 2 of those buffers. First the GPU renders a frame into the first buffer. Then it waits until the other part of the GPU starts sending the contents of that buffer to the monitor. Then it renders the next frame into the second buffer. The other part keeps sending the data from the first buffer to the monitor over and over, until the second buffer finally contains a rendered frame. Then it switches buffers.

Oops, that only happens if the GPU cannot render fast enough (less than 60fps). If it can, then the content of the buffer will be sent to the monitor only once. But then the rendering part of the GPU must wait until buffers can be swapped.

Sounds good, doesn't it? No tearing, no calculating of frames that are never sent to the monitor. But there are 2 problems. First, the GPU might take just a little bit longer than 1/60th of a second to compute a frame. The effect is that you get 30 fps---only every second refresh shows a new frame. The second problem happens when the GPU is actually very fast. It may take 1/10th of the allotted 1/60th of a second to render the frame, and then it has to wait the remaining 9/10ths of that 1/60th of a second for the frame to be sent to the monitor. So that frame sat in the buffer for 9/600ths of a second, whereas with vsync off it would only have sat there for about 1/600th of a second. So we get a slight lag here.
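A quick back-of-the-envelope sketch of that first problem, assuming a 60 Hz refresh and strict double-buffered vsync: a finished frame can only be shown at a vblank, so the effective rate is the refresh rate divided by the number of refresh intervals the frame took to render.

#include <cmath>
#include <cstdio>
#include <initializer_list>

int main() {
    const double refresh_ms = 1000.0 / 60.0;  // one refresh interval at 60 Hz, ~16.7 ms
    for (double render_ms : {10.0, 16.0, 17.0, 25.0, 34.0}) {
        // A frame that misses its vblank waits for the next one, so the time
        // between displayed frames is the render time rounded up to a whole
        // number of refresh intervals.
        double intervals = std::ceil(render_ms / refresh_ms);
        std::printf("render time %4.1f ms -> %4.1f fps\n", render_ms, 60.0 / intervals);
    }
}

Rendering in 16 ms still gives 60 fps, but 17 ms drops straight to 30 fps, and slower frames fall to 20, 15 and so on.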

(cont'd)

Henry Loenwind
Gallente
Area 42
Posted - 2009.05.03 22:13:00 - [107]
 

(4d) vsync on, triple buffering

A third buffer is added---how does that help? It helps against both problems noted under 4c. It allows the rendering part of the GPU to keep rendering all the time. There may be a frame in the first buffer that is currently being sent to the monitor, and a frame in the second buffer that has been rendered and is waiting, and the GPU can then render into the third buffer. Once it has rendered the third buffer, and the first one is still being sent (and the second one is getting stale by the nanosecond), it will render into the second buffer again, discarding the old image---no problem for the sending part: when it finishes with the first buffer, it can continue with the third one.
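For the curious, here's a toy C++ model of that buffer juggling (my own sketch of the scheme described above, not any real driver's code): the renderer always has somewhere to draw, the newest finished frame waits for the next vblank, and a waiting frame that has gone stale simply gets overwritten.

#include <cstdio>

struct TripleBuffer {
    int scanout = 0;   // buffer currently being sent to the monitor
    int ready   = -1;  // newest completed frame waiting for a vblank (-1 = none)
    int drawing = 1;   // buffer the GPU is rendering into right now

    // GPU finished a frame: it becomes the "ready" frame; the previously
    // ready (now stale) frame or the spare buffer becomes the new draw target.
    void finishRender() {
        int next = (ready == -1) ? spare() : ready;
        ready = drawing;
        drawing = next;
    }

    // Monitor refresh: show the newest completed frame if there is one,
    // otherwise keep repeating the buffer already being scanned out.
    void vblank() {
        if (ready != -1) { scanout = ready; ready = -1; }
    }

    int spare() const {   // the one buffer that is neither shown nor drawn into
        for (int i = 0; i < 3; ++i)
            if (i != scanout && i != drawing) return i;
        return drawing;   // not reached with three buffers
    }
};

int main() {
    TripleBuffer tb;
    tb.finishRender();  // fast GPU: two frames finished before the next refresh...
    tb.finishRender();  // ...the older waiting frame is overwritten, never displayed
    tb.vblank();        // the monitor shows the newest one
    std::printf("scanout=%d drawing=%d\n", tb.scanout, tb.drawing);
}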

(5) Details

Yes, I (over-)simplified that all. What did you expect? A scientific paper? <g>


Gariuys
Evil Strangers Inc.
Posted - 2009.05.03 22:17:00 - [108]
 

simplified but nice anyway, good job on explaining the third buffer. ;-D

Henry Loenwind
Gallente
Area 42
Posted - 2009.05.03 22:23:00 - [109]
 

Originally by: Onus Mian
Can your eyes even see things changing at faster than 60 fps? It's been a while since I did this at school, but from what I remember the rate at which the pigments in your eye can be replenished limits how many frames you can see in a second, regardless of how high the fps is.


The eye works differently from our technology. Simplified:

A camera lets light fall onto its pixels, then after a certain time asks each pixel how much light it got.

An eye lets light fall onto its pixels, and every time a pixel has received a certain amount of light it "fires" a signal.

The effect is that in a high-light setting the eye's time resolution is better than in a low-light setting. If an eye-pixel fires 5 times while it sees one display frame, we can see the changes that are not part of the frame but of the process of frame-changing. If the eye-pixel fires once per 2 display frames, we see nice smooth movement (those 2 combined frames even give us motion blur!).

gamertrav
Trauma Ward
Posted - 2009.05.03 23:23:00 - [110]
 

Nice overview of all the aspects in play here Henry. :)

Originally by: Agent Known
Set it to Interval Immediate and leave it be. Burn up your graphics card for all I care. This threadnaught needs to die.


So you reply to the post, keeping it at the top of the forums? Nicely done.

Adaris
E X I U S
Posted - 2009.05.04 00:43:00 - [111]
 

I have been told to come to this thread to purchase some additional FPS. thank you.

Astria Tiphareth
Caldari
24th Imperial Crusade
Posted - 2009.05.04 11:58:00 - [112]
 

Originally by: Henry Loenwind
Superb analysis

This should be required reading Smile. Great analysis and explanation. I must confess I'd forgotten some of those details, so the refresher was appreciated. I'd never known about the hardware side, like the LCD having its own buffer.

However, there's one area I think needs clearing up.

In Direct3D one creates a swap chain of buffers to do the buffering as Henry outlined. Direct3D manages the swapping for you in DX9. As it happens, you can't ever directly access the front buffer (which gets sent to the monitor), so D3D forces you to double-buffer as a minimum, whether you like it or not. However, the presentation interval you set determines what happens when you tell Direct3D the back buffer is ready.

Your steps are in essence:
Render everything to the back buffer.
Present that back buffer to the chain.
The driver goes off and swaps the chain. Your front buffer just became your back buffer, and any intervening buffers moved forward one.
The driver presents the new front buffer according to presentation interval.
Note that what you just rendered didn't necessarily get sent to the screen just yet. It might take another present and swap, depending on how many buffers you have.

If you're set to presentation interval immediate, then the driver ignores whatever the monitor is currently doing and sends the data as soon as possible. This is v-sync off as Henry outlined.

If you're set to presentation interval one, then the driver will wait until the monitor reaches a vertical blank. In CRT terms, this is the period when the beam must reset from the bottom corner of your monitor to the top corner to start drawing again.

Intervals two & three and so on merely double or triple the waiting period. So if your refresh rate is 60Hz and your presentation interval is two, it'll present only every two vertical blanks and give you 30 fps. Interval default is effectively interval one, but with some subtleties around timing & windowed vs full screen mode.
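To make that concrete, here's roughly how a DirectX 9 application describes its swap chain when creating the device (a minimal sketch, not EVE's actual code; the helper name and the windowed-mode settings are my own assumptions):

#include <windows.h>
#include <d3d9.h>

// Fill in the swap-chain description passed to CreateDevice().
// D3DPRESENT_INTERVAL_ONE corresponds to "Interval one" (v-sync on),
// D3DPRESENT_INTERVAL_IMMEDIATE to "Interval immediate" (v-sync off).
void DescribeSwapChain(D3DPRESENT_PARAMETERS& pp, HWND window, bool vsync)
{
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed             = TRUE;                    // assume windowed mode for simplicity
    pp.hDeviceWindow        = window;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;   // let the driver manage the chain
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // match the current desktop format
    pp.BackBufferCount      = 2;                       // two back buffers plus the front buffer
    pp.PresentationInterval = vsync ? D3DPRESENT_INTERVAL_ONE
                                    : D3DPRESENT_INTERVAL_IMMEDIATE;
}

Each frame the application renders into the current back buffer and then calls IDirect3DDevice9::Present(NULL, NULL, NULL, NULL); with interval one, that Present call is where the waiting on the refresh happens, as discussed below.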

The Direct3D call to Present returns as soon as the swap operation being requested has been queued up (i.e. transferring one buffer to another). What does this mean for performance?

What the above actually means is that as you run out of frame buffers to render to, Direct3D will stop Present from returning until there's a new buffer available. Thus your frame rate ends up locked to the ability of Direct3D to manage the swap chain. With immediate presentation, buffers cycle quickly, the card works overtime, and you get maximum FPS. With interval one, Present will shortly end up (within 5 frames or so for certain) locked to the refresh rate.

Thus triple buffering in DirectX is acting purely as a reservoir. What you render as a frame now won't turn up for at least two presents. The advantage that Henry describes where you can render to the second buffer again because it's gone stale doesn't exist in DirectX 9. In DX10 and 11 this may become possible.

This is why v-sync on a Direct3D app leads to lower card temperatures with high performance cards. The card simply isn't working as hard. Both rendering and present cycles are locked to the refresh rate.

If the rendering cycle could operate independently of the present cycle, as with the alternative form of triple buffering, no major temperature difference would be observed.

Short version for DirectX 9 applications like EVE:
V-sync off -> tearing and maximum FPS, maximum card heat, and the opportunity to see some part of the frame a few ms earlier.

V-sync on -> no tearing, clamped FPS, lower card temperatures, at the cost of some ms delay depending on how long the buffering & refresh rate takes.

Arkeladin
Posted - 2009.05.04 12:27:00 - [113]
 

Originally by: Gabriel Loki
Originally by: Lonzo Kincaid
what's the frame rate for human eyes?


They dont have one.


They do, just not in the way people think.

Without getting VERY technical, the human eye can only perceive changes in a given "scene" at a limited rate. That rate is EQUIVALENT to about 20 fps. Beyond that, since eyes are an organic system and don't have anything like a shutter, the frames start bleeding together. This is called "persistence of vision" and is what allows us to perceive motion. Abusing this somewhat is how movies and TV work - TV *FRAMES* per second can be as low as 25 and still appear smooth (PAL TV).

Look it up yourselves

Pan Crastus
Anti-Metagaming League
Posted - 2009.05.04 12:51:00 - [114]
 

Originally by: Weer Treyt
Please do the following test:

1.1 Switch your graphics settings to interval immediate.
1.2 In a station take one item on your cursor and drag it left and right over the whole screen, till you get a feeling for the icon lagging behind your movements.
1.3 You realize that the icon is lagging behind only by a tiny amount.

2.1 Switch your graphics settings to interval one (or more).
2.2 Do the same as in 1.2.
2.3 Realize that the icon is lagging behind noticeably more than with interval immediate settings.




Interesting test, but it doesn't show a higher refresh rate. What it shows is that EVE is rendering 1 or more (seems like more) frames in advance, so the frames you see are 2-3 60ths of a second behind the hardware mouse cursor. I don't know why it is so clearly visible, but it seems to suggest that EVE's double/triple-buffering could be improved...

Astria Tiphareth
Caldari
24th Imperial Crusade
Posted - 2009.05.04 13:12:00 - [115]
 

Edited by: Astria Tiphareth on 04/05/2009 13:13:09
Originally by: Pan Crastus
Interesting test, but it doesn't show a higher refresh rate. What it shows is that EVE is rendering 1 or more (seems like more) frames in advance, so the frames you see are 2-3 60ths of a second behind the hardware mouse cursor. I don't know why it is so clearly visible, but it seems to suggest that EVE's double/triple-buffering could be improved...

It also fails to take into account everything else that happens as you drag the icon around. EVE is entirely single-threaded, so the immediate presentation allows for a faster response time on other code e.g. network-related queries or detecting mouse events.

As I said earlier, the critical issue that people seem to forget is that beyond potentially a very slightly more responsive GUI, and a few milliseconds more warning of a visual change that is entirely client-extrapolated anyway, maximum FPS or clamped FPS makes little difference to EVE's gameplay. It's all focused around a full 1 second update cycle.

For now, immediate mode is useful for those who want every aspect of performance pushed as far as possible. For those of us who don't want our graphics card running at 100% all the time, interval one is a far more acceptable compromise. The real test is this: play EVE, fight, get into a fleet fight with v-sync off. Do the same with v-sync on. Has it made any real difference to the game, or to your success in it? That is the critical question to ask, and it's a personal one, unique to each of us and our environment.

Taedrin
Gallente
Kushan Industrial
Posted - 2009.05.04 13:23:00 - [116]
 

Edited by: Taedrin on 04/05/2009 13:26:35
Originally by: Akita T
Edited by: Akita T on 17/04/2009 16:58:02

Turning VSynch off just so you can SHOW OFF your "omfg, 279 FPS EVE, wtfbbq" is downright stupid, because it's KNOWN to have contributed a lot to premature frying of several video cards.

Also, like many, MANY people have said here before, if the monitor can only display 60 (or 75, or 100, or 120, whatever), there's no BENEFIT in turning VSynch off ("Interval immediate") other than the FPS e-p33n number on the FPS monitor. At best, you will see half of one frame and half of another with a bit of tearing in between, but that's just stupid.



In EVE, at pretty much all times, VSynch should be turned on (Interval one) and left that way forever.
It has no serious drawbacks (some MIGHT argue that it doesn't "feel that dynamic anymore" because they were used to the tearing effect) but a lot of benefits compared to the alternative (longer vidcard life, for starters).



NOT ALWAYS TRUE.

Vsynch keeps the graphics card synched with the monitor, and this is all well and good if the graphics card is fast enough to "keep up" with the monitor. But if your graphics card is rendering an intensive scene, vsynch has a couple of consequences. First off, vsynch will NOT start updating a scene on the monitor until the monitor has finished drawing the last frame. This means that if you have vsynch enabled, your frame rate will be something like 4 fps, 7-8 fps, 15 fps, 30 fps, or 60 fps. Please note that the FPS displayed by EVE, FRAPS or whatnot is actually an AVERAGE frame rate, not the actual current frame rate, so it will show different numbers.

Here's an example that I've posted on these forums before:

"
In this example, your rendering FPS is 5/6 of the refresh rate. Each column (f1-f6) is one monitor refresh, the four rows are the four quarters of the screen from top to bottom, and the digit shows which rendered frame's content ends up in that part of the screen.

Interval immediate:
f1 f2 f3 f4 f5 f6
1111 1111 2222 3333 4444 6666
1111 2222 2222 3333 4444 6666
1111 2222 3333 3333 4444 6666
1111 2222 3333 4444 4444 6666

Interval one:
f1 f2 f3 f4 f5 f6
1111 1111 3333 3333 5555 6666
1111 1111 3333 3333 5555 6666
1111 1111 3333 3333 5555 6666
1111 1111 3333 3333 5555 6666


You see here that interval immediate only drops frame #5, but is only able to draw a portion of frames 2-4 on time. Interval one drops frames #2 and #4 so it can get a head start on frames #3 and #5 (so it actually finishes #5 on time). However, more frames were dropped under interval one than under interval immediate.
"

EDIT: You will see that while Interval immediate does more work, it introduces an artifact called "tearing" into the rendering process. Vsynch will cause your effective frame rate to drop, but will eliminate "tearing".

Chribba
Otherworld Enterprises
Otherworld Empire
Posted - 2009.05.04 13:27:00 - [117]
 

So, in order for LEAST wear&tear (heat) - vsync+interval one is the way to go? I'm fine with 20fps if that means my GPU sticks at 50C rather than 85 I'm all happy. What to do?

Astria Tiphareth
Caldari
24th Imperial Crusade
Posted - 2009.05.04 13:35:00 - [118]
 

Edited by: Astria Tiphareth on 04/05/2009 13:37:45
Originally by: Chribba
So, in order for LEAST wear&tear (heat) - vsync+interval one is the way to go? I'm fine with 20fps if that means my GPU sticks at 50C rather than 85 I'm all happy. What to do?

Interval one is v-sync; it's just a different set of naming conventions. It will lock to the refresh rate of your monitor, so that your maximum FPS never goes above it. Empirical evidence and a fair amount of logic indicate that your card will run cooler as a result. How much impact that has on the lifetime of your card will depend Very Happy.

Edit: You could set it to two, three or four if you want, but the FPS cap drops to a half, a third or a quarter of the refresh rate accordingly, and I'd personally not recommend three or four.

Taedrin
Gallente
Kushan Industrial
Posted - 2009.05.04 13:37:00 - [119]
 

Originally by: Chribba
So, in order for LEAST wear&tear (heat) - vsync+interval one is the way to go? I'm fine with 20fps if that means my GPU sticks at 50C rather than 85 I'm all happy. What to do?


Yes. Interval Immediate tells the graphics card to do as much work as it possibly can, while interval one (vsynch) allows it to take a break between frame refreshes. BTW, I think we've actually already gone over this in this thread Laughing

Seishi Maru
doMAL S.A.
Posted - 2009.05.04 15:19:00 - [120]
 

Edited by: Seishi Maru on 04/05/2009 15:23:17
Originally by: Shintai
Originally by: Grez
Idd, the last two posts have hit it on the head.

Human eyes can perceive up to, and just past, 200 fps. Some people cannot, some people can - it's a bit like hearing range, but for the eyes; well yeah, you get the idea...

Locking your fps to your monitor's refresh rate can also have detrimental effects. Look up vsync and what it does.

All I did was state that not being able to see past 60fps is rubbish, and not a reason to lock your computer to a certain fps. It's also not a reason to use vsync. Vsync should only be used if you experience tearing of textures/scenes (the issue of a frame being dropped halfway through being rendered).


First of all, the human eye doesn't know if it's 30 FPS or 2 million FPS. Does the screen flicker for you in the cinema?

Secondly... LCDs... 60 Hz... 60 FPS... Bingo. Anything above that simply won't get shown; the frames are discarded!


That is because movies do not use an additive composite colouring scheme. They use a filtering colour scheme on film that causes slight blurring as one frame passes into the next. That greatly reduces the number of frames per second needed to deceive the human brain. But on a monitor you need quite a bit more to achieve the same effect; usually, for most people, around 50 fps is enough. Remember that although each receptor in your eye can only operate at a much lower frequency, they are NOT synchronised. One receptor may only fire every n milliseconds, but N others with the same frequency can be offset a little and fire in between those n milliseconds. That is why you CAN notice that something is wrong with a scene in some situations at frame rates like 20-25 fps...


Also, when you disable v-sync you are just creating problems, since the CPU must calculate all the trash (trash because it won't be seen) that must be sent to the GPU. In that scenario you are just overstressing your CPU and making the rest of the software on your computer have a harder time if it is running in the background.


