maximum frame rates?
samerkablamer
hey all, this question isn't actually specifically GW related, but it's the technician's corner so i thought i could ask it anyway...
most gamers CARE about framerates being as high as possible, with many people not even satisfied with 60. i have some questions about this...
first of all, most people can't tell the difference between 60fps and anything higher; it looks the same. so why do these superb gamer enthusiasts seem to notice?
secondly, isn't there a limit to how many fps a standard LCD monitor can deliver? some people talk about hitting 110fps in many games, but can an LCD monitor with a refresh rate of 60 or 75Hz even display that many frames?
Lord Sojar
To answer your first question:
The human eye can technically see an unlimited number of FPS... but, for the sake of argument, the upper cap is around 4000. That said, it doesn't mean you need 4000 to make things look normal; 4000+ is roughly the limit at which cone and rod stimulus can detect changes in light. Motion blur is the result of our eyes not being able to keep up with a fast-moving object.
However... anything above 60 FPS isn't noticeably different to the human eye, because of motion blur. Many modern games add motion blur of their own, making anything above 60 FPS irrelevant in most cases. 100+ FPS is fairly smooth to our eyes, yes, but without motion blur happening, it still looks "unnatural" to us.
You can chalk up the craving for 100+ FPS to e-peen inflation and just desiring big numbers to prove your system is amazing, etc.
As for the second question:
Yes, LCDs are bound by their refresh rate. A 60Hz monitor receives no benefit from anything above 60 FPS, and likewise a 75Hz monitor above 75 FPS. If anything, picture quality can be diminished if VSYNC is left off on a system capable of sustaining a solid 60/75+ FPS (varies by monitor, of course).
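If it helps to picture it, here's a rough sketch of the cap that VSYNC effectively imposes - plain C++ with made-up timings, not anything from GW or a real driver:
Code:
// minimal frame limiter sketch: caps a render loop at ~60 updates/sec,
// roughly what VSYNC does for you on a 60Hz monitor (illustration only)
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(16667); // ~1/60 sec

    int frames = 0;
    const auto start = clock::now();
    while (frames < 300) {                      // run for about 5 seconds
        const auto frame_start = clock::now();
        // renderFrame() would go here; leftover time is simply spent waiting
        const auto elapsed = clock::now() - frame_start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);
        ++frames;
    }
    const double secs = std::chrono::duration<double>(clock::now() - start).count();
    std::cout << frames / secs << " fps\n";     // prints ~60, not 1000+
}
No matter how fast the render step is, the loop can't exceed ~60 fps; the extra GPU speed just turns into waiting (or, with the cap off, into heat).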
Does that help?
samerkablamer
hmm, thanks a lot, that's exactly what i was looking for. i always thought it didn't make sense for the frame rate to go higher than the refresh rate, but i usually don't know what i'm talking about, so i kept my mouth shut about it =P
subarucar
I ran the -dx8, -noshaders and -something else switches for testing purposes. My framerate went up. I'm not sure if this is GW's little FPS readout being wrong, or if it really is a way to get the framerate up. Then again, if you're trying to get a high framerate, chances are you don't want low graphics settings anyway.
tw1tchdp
Quote:
Originally Posted by Lord Sojar
To answer your first question:
The human eye can technically see an unlimited number of FPS... but, for the sake of argument, the upper cap is around 4000.
i think more around 100 ^^
mathiastemplar
Some good articles on Wikipedia about this :P
http://en.wikipedia.org/wiki/Framerate
http://en.wikipedia.org/wiki/Wagon-wheel_effect
But in general, anything above 24fps should be acceptable.
glhf =)
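the wagon-wheel effect is just temporal aliasing, by the way, and you can put numbers on it. a quick sketch (C++, rev/s values picked purely for illustration):
Code:
// wagon-wheel effect in a nutshell: sampling at a fixed frame rate aliases
// fast rotation. the apparent motion is the true per-frame rotation wrapped
// into (-180, 180] degrees.
#include <cmath>
#include <cstdio>

double apparentStep(double rotationsPerSec, double fps) {
    double step = 360.0 * rotationsPerSec / fps; // true degrees per frame
    double wrapped = std::fmod(step, 360.0);     // what the camera/eye sees
    if (wrapped > 180.0) wrapped -= 360.0;       // pick the shorter arc
    return wrapped;
}

int main() {
    // a wheel at 23 rev/s filmed at 24 fps seems to spin backwards:
    std::printf("%.1f deg/frame\n", apparentStep(23.0, 24.0)); // -15.0
    // at exactly 24 rev/s it appears frozen:
    std::printf("%.1f deg/frame\n", apparentStep(24.0, 24.0)); // 0.0
}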
moriz
24fps is most certainly NOT acceptable while gaming. it's only acceptable in movies, with their extensive motion blur, watched seated in darkness.
Aera
It's just the video card rendering that many frames; the monitor displays whatever it can.
Notorious Bob
Quote:
Originally Posted by Lord Sojar
You can chalk up the craving for 100+ FPS to e-peen inflation and just desiring big numbers to prove your system is amazing, etc.
As for the second question:
Yes, LCDs are bound by their refresh rate. A 60Hz monitor receives no benefit from anything above 60 FPS, and likewise a 75Hz monitor above 75 FPS. If anything, picture quality can be diminished if VSYNC is left off on a system capable of sustaining a solid 60/75+ FPS (varies by monitor, of course).
Does that help?
You'll often hear the phrase 'flicker free', i.e. where FPS = refresh rate; this was a lot more obvious in the days of interlaced CRT monitors.
The only thing you really accomplish by having your GPU bash out frames way in excess of the monitor's refresh rate is increasing the heat output of the GPU. Quite an expensive way to heat your gaming area! :O
Elder III
my opinion on what "acceptable" framerates are depends on the game I'm playing. For example, a fast paced action game - a shooter like Call of Duty 5 - is much smoother and better if you can get that 60fps, although 50 is adequate. A less fast paced game with fewer characters running around - i.e. an action RPG like Oblivion - is fine if you stay in the 40+ range. Some RTS games are very smooth at 30+ FPS, although ideally I like to stay at 45 FPS or more in all my games.
Generally speaking, the more important consideration when talking about "acceptable" FPS is the lowest fps you hit. Oblivion is a notorious example: its framerate is very unstable and can swing from 60+ down to 20 or lower in some areas, thereby necessitating a more powerful machine, mostly to keep the dips from going too low. (there are some good Oblivion mods that help low to mid range systems btw)
Anyways, enough talk about one of my fave games. I guess the point of my Saturday morning ramble is that I like to max my games out and maintain 45 FPS or higher. If I can get that much I'm satisfied; if not, I start thinking about new hardware. Now to finish my coffee, thereby hopefully making my next post more coherent.
Lord Sojar
Quote:
Originally Posted by tw1tchdp
i think more around 100 ^^
Quote:
Originally Posted by mathiastemplar
Some good articles on Wikipedia about this :P
http://en.wikipedia.org/wiki/Framerate
http://en.wikipedia.org/wiki/Wagon-wheel_effect
But in general, anything above 24fps should be acceptable.
glhf =)
Quaker
Quote:
Originally Posted by samerkablamer
hey all, this question isn't actually specifically GW related, but it's the technician's corner so i thought i could ask it anyway...
Quote:
Originally Posted by samerkablamer
most gamers CARE about framerates being as high as possible, with many people not even satisfied with 60.
Also, for testing purposes, running fps as high as possible is a way to compare one system/component to another.
Quote:
Originally Posted by samerkablamer
why do these superb gamer enthusiasts seem to notice?
Some people who run FPS (First Person Shooter) type games (and GW PvP) swear that they can play and react better with high fps. Who knows? When I play TFC I don't notice any difference above 60 fps.
Quote:
Originally Posted by samerkablamer
secondly, isn't there a limit to how many fps a standard LCD monitor can deliver? some people talk about hitting 110fps in many games, but can an LCD monitor with a refresh rate of 60 or 75Hz even display that many frames?
If the game is running at more than 60 fps, the frame buffer's contents are being updated more than 60 times a second, but the buffer is still only sent to the monitor 60 times a second. Depending upon the design of the frame buffer, its contents can change while the frame is being sent to the monitor. This can result in video artifacts such as "tearing", where part of the picture seems offset from the rest. But in any case, the LCD itself is not being updated any faster than 60 times per second (or whatever its native vertical refresh rate is).
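Here's a crude model of that, if it helps - pure arithmetic with invented numbers, not how any real driver works:
Code:
// crude model of "tearing": the monitor reads the frame buffer top to bottom
// at a fixed rate; if the game swaps buffers mid-readout, the top of the
// displayed image is from one frame and the bottom from the next.
#include <cstdio>

int main() {
    const int scanlines = 1080;
    const double refresh_hz = 60.0, game_fps = 110.0; // game outruns monitor

    int frame_on_screen = 0;
    double next_swap = 1.0 / game_fps;      // when the game finishes a frame
    const double t_per_line = (1.0 / refresh_hz) / scanlines;

    for (int line = 0; line < scanlines; ++line) {
        const double t = line * t_per_line; // time this scanline is read out
        if (t >= next_swap) {               // buffer swapped mid-scan: a tear
            std::printf("tear at scanline %d: frame %d above, frame %d below\n",
                        line, frame_on_screen, frame_on_screen + 1);
            ++frame_on_screen;
            next_swap += 1.0 / game_fps;
        }
    }
}
At 110fps against a 60Hz panel, a swap lands mid-scan on roughly every refresh - that seam is exactly the "offset" you see with vsync off.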
Basically, when gaming, there is no purpose in turning off vsync and running GW at more than 60fps. It just causes the video card to work harder, creating more heat, and makes the fan run faster/louder.
blackmage_z
Quote:
Originally Posted by Quaker
Some people who run FPS (First Person Shooter) type games (and GW PvP) swear that they can play and react better with high fps. Who knows? When I play TFC I don't notice any difference above 60 fps.
Here is an interesting article on controller latency for those interested:
http://www.eurogamer.net/articles/di...factor-article
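The rough reason fps touches input latency at all is simple arithmetic: each pipeline stage costs about a frame, and frame time shrinks as fps rises. A sketch with illustrative numbers (my own, not figures from the article):
Code:
// back-of-envelope input latency: one frame of game logic, one frame of
// rendering, then scanout to a 60Hz panel. frame time falls as fps rises,
// which is the grain of truth in "high fps feels snappier". numbers invented.
#include <cstdio>
#include <initializer_list>

int main() {
    const double scanout_ms = 16.7;          // 60Hz panel, fixed cost
    for (double fps : {30.0, 60.0, 120.0}) {
        const double frame_ms = 1000.0 / fps;
        const double latency = frame_ms /*logic*/ + frame_ms /*render*/ + scanout_ms;
        std::printf("%5.0f fps -> ~%5.1f ms button-to-photon\n", fps, latency);
    }
}
Going from 30 to 60 fps saves ~33ms here; 60 to 120 saves only ~17ms more. Real, but we are talking milliseconds.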
Lord Sojar
Quote:
Originally Posted by Quaker
Basically, when gaming, there is no purpose in turning off vsync and running GW at more than 60fps. It just causes the video card to work harder, creating more heat, and makes the fan run faster/louder.
However, it was fun running a G300 card at 1,200+ FPS in GW at 1920x1200 with 16xQ MSAA, 24x AF, and triple buffering. LOL. :P
moriz
screenshot or it never happened
but yeah, vsync in GW tends to make the camera somewhat laggy. i generally play with it turned off to get a more responsive camera rotation.
Blackhearted
Quote:
Originally Posted by Lord Sojar
However... anything above 60 FPS isn't noticeably different to the human eye, because of motion blur. Many modern games add motion blur of their own, making anything above 60 FPS irrelevant in most cases. 100+ FPS is fairly smooth to our eyes, yes, but without motion blur happening, it still looks "unnatural" to us.
As Elder III said, that depends on the game. In sloooooowww paced games like GW and most other MMOs, 24-30 fps is quite usable.
moriz
GW is as fast paced (or as slow paced) as you want it to be. with my playstyle, anything less than 60 fps WILL cut into my performance. it actually depends on the build; ranger is affected greatly, while simpler builds like the typical assassin builds aren't impacted at all.
Quaker
Quote:
I believe higher frame rates help to reduce input latency, as the quicker the screen refreshes, the faster the change is displayed on screen. But we are talking milliseconds of difference.
As your ping goes up, your server-updates-per-second go down. It may be fun to run GW at 200 fps, but it would have no effect on the game unless your ping is very, very low (sub 50ms) AND the server also updates 200 times per second.
99% of the apparent increase in performance above 60Hz can basically be attributed to the "placebo effect". (Maybe wearing a copper bracelet while you play can boost your speed AND reduce arthritis pain.)
As a side note: many years ago I ran a TFC (Team Fortress Classic) server. Same as now, many of the players swore they could play better and react faster with high fps rates. But, being the server admin, I could see the actual "fps" of the server. Depending upon the number of players, the server ran between 75 and 90 "fps" (not really frames-per-second as there was no video on the server - more properly program-loops-per-second). So it didn't matter what the players thought they were getting, the server would only update 90 times per second (max) anyway.
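To put rough numbers on that - the 90/sec tick rate below is the TFC figure from above; GW's actual rate is unknown outside ArenaNet:
Code:
// how many rendered frames can actually contain NEW game state? capped by
// the server's update rate, no matter what the client fps counter says.
#include <algorithm>
#include <cstdio>
#include <initializer_list>

int main() {
    const double server_ticks_per_sec = 90.0;  // the TFC server's max above
    for (double client_fps : {30.0, 60.0, 200.0}) {
        const double fresh = std::min(client_fps, server_ticks_per_sec);
        std::printf("client at %3.0f fps: at most %3.0f frames/sec carry new "
                    "server state (%.0f%% of frames)\n",
                    client_fps, fresh, 100.0 * fresh / client_fps);
    }
}
Past the server's rate, extra frames can only re-draw (or interpolate) state the client already has.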
It would be interesting to discover the actual "fps" rate of a typical GW server.
Rushin Roulette
Quote:
Originally Posted by moriz
GW is as fast paced (or as slow paced) as you want it to be. with my playstyle, anything less than 60 fps WILL cut into my performance. it actually depends on the build; ranger is affected greatly, while simpler builds like the typical assassin builds aren't impacted at all.
I play HL2: DM (used to play and run tournaments and leagues as well), and GW is reeeealy slow by comparison. in GW, I've never seen 12+ players bouncing around like kangaroos on a sugar shock after a double intravenous dose of Speed.
At a certain level you are just at the maximum of your own reflex timing.
Most top level FPS players don't actually need the frames/s boost from lowering the graphics options to below minimum; they just don't want to be distracted by random patterns in the background that might look like an opponent out of the corner of their eye.
Ouch
Hah, I've had to put up with 9 FPS for the last 3 years, and all I do is PvP. it's hard.
riktw
Quote:
Originally Posted by Lord Sojar
However, it was fun running a G300 card at 1,200+ FPS in GW at 1920x1200 with 16xQ MSAA, 24x AF, and triple buffering. LOL. :P
guessing 16xMSAA is forced in the nvidia control panel.
still, 1200FPS, damn, me wants crysis benchmarks.
getting 700+ here with my HD4870*2, fun to impress others.
anyways, for GW, 60FPS is more than enough; for shooters and other fast games 100 can be nice, if your screen supports it.
most oldschool CRTs of good quality can go up to a 120Hz refresh rate.
so get an old 22" CRT on ebay, and gogogo 120FPS with vsync on.
Lord Sojar
Quote:
Originally Posted by riktw
guessing 16xMSAA is forced in the nvidia control panel.
still, 1200FPS, damn, me wants crysis benchmarks.
getting 700+ here with my HD4870*2, fun to impress others.
anyways, for GW, 60FPS is more than enough; for shooters and other fast games 100 can be nice, if your screen supports it.
most oldschool CRTs of good quality can go up to a 120Hz refresh rate.
so get an old 22" CRT on ebay, and gogogo 120FPS with vsync on.