maximum frame rates?

samerkablamer
Frost Gate Guardian
#1
hey all, this question isn't actually specifically GW related, but it's the Technician's Corner so i thought i could ask it anyway...

most gamers CARE about framerates being as high as possible, with many people not even being satisfied with 60. i have some questions about this...

first of all, most people can't tell the difference between something running at 60fps and something higher. it looks the same. why do these hardcore gamer enthusiasts seem to notice?

secondly, isn't there a limit to how many fps a standard LCD monitor can deliver? some people talk about getting 110fps in many games, but can an LCD monitor with a refresh rate of 60 or 75Hz even display that many frames?
Lord Sojar
The Fallen One
#2
To answer your first question:

The human eye can technically see an effectively unlimited number of FPS... but, for the sake of argument, the upper cap is around 4000. That said, that doesn't mean you need 4000 to make things look normal; 4000+ is roughly the limit at which cone and rod stimulus can detect changes in light. Motion blur is the result of our eyes not being able to keep up with a fast-moving object.

However... anything above 60 FPS isn't noticeably different to the human eye, because of motion blur. Many modern games have added motion blur, making anything above 60 FPS unnecessary in most cases. 100+ FPS is fairly smooth to our eyes, yes, but without motion blur happening, it still looks "unnatural" to us.

You can chalk up the craving for 100+ FPS to e-peen inflation and just desiring big numbers to prove your system is amazing, etc.


As for the second question:

Yes, LCDs are bound by their refresh rate. A 60Hz monitor receives no benefit from anything above 60 FPS, and the same goes for a 75Hz monitor with anything above 75 FPS. If anything, picture quality can be diminished without VSYNC enabled on systems capable of maintaining a solid 60/75+ FPS (varies by monitor, of course).
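
To illustrate - here's a minimal sketch (Python; the 60Hz target and the empty render_frame() are placeholder assumptions) of the cap that VSYNC effectively enforces. A frame finished faster than the refresh interval would never reach the screen anyway:

Code:
import time

REFRESH_HZ = 60                  # assumed monitor refresh rate
FRAME_TIME = 1.0 / REFRESH_HZ    # ~16.7 ms of screen time per frame

def render_frame():
    pass                         # placeholder for the game's actual rendering work

while True:
    start = time.perf_counter()
    render_frame()
    # Sleep off the rest of the frame budget; finishing sooner than this
    # gains nothing, since the monitor won't show the frame any earlier.
    leftover = FRAME_TIME - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)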

Does that help?
samerkablamer
Frost Gate Guardian
#3
hmm thanks a lot, exactly what i was looking for. i always thought that it doesn't make sense for the frame rate to go higher than the refresh rate, but i usually don't know what i'm talking about so i kept my mouth shut about it =P
subarucar
Desert Nomad
#4
I ran with -dx8, -noshaders and -something else for testing purposes, and my framerate went up. I'm not sure if this is GW's little FPS readout being wrong, or if it really is a way to get the framerate up. However, if you're trying to get a high framerate, chances are you don't want low graphics settings.
tw1tchdp
Lion's Arch Merchant
#5
Quote:
Originally Posted by Rahja the Thief
To answer your first question:

The human eye can technically see an effectively unlimited number of FPS... but, for the sake of argument, the upper cap is around 4000. That said, that doesn't mean you need 4000 to make things look normal; 4000+ is roughly the limit at which cone and rod stimulus can detect changes in light. Motion blur is the result of our eyes not being able to keep up with a fast-moving object.
i think more around 100 ^^
moriz
Über těk-nĭsh'ən
#7
24fps is most certainly NOT acceptable while gaming. it is only acceptable in movies, with their extensive motion blurring, watched seated in darkness.
Aera
Forge Runner
#8
Quote:
Originally Posted by samerkablamer
hmm thanks a lot, exactly what i was looking for. i always thought that it doesn't make sense for the frame rate to go higher than the refresh rate, but i usually don't know what i'm talking about so i kept my mouth shut about it =P
It's just the video card rendering that many frames; the monitor displays whatever it can.
Notorious Bob
Frost Gate Guardian
#9
Quote:
Originally Posted by Rahja the Thief
You can chalk up the craving for 100+ FPS to e-peen inflation and just desiring big numbers to prove your system is amazing, etc.


As for the second question:

Yes, LCDs are bound by their refresh rate. A 60Hz monitor receives no benefit from anything above 60 FPS, and the same goes for a 75Hz monitor with anything above 75 FPS. If anything, picture quality can be diminished without VSYNC enabled on systems capable of maintaining a solid 60/75+ FPS (varies by monitor, of course).

Does that help?
QFT!

Often you'll hear the phrase 'flicker free', i.e. where FPS = refresh rate; this was a lot more obvious in the days of interlaced CRT monitors.

The only thing you really do by having your GPU bash out frames way in excess of the monitor's refresh rate is increase the heat output of the GPU. Quite an expensive way to heat your gaming area! :O
Elder III
Furnace Stoker
#10
my opinion on what "acceptable" framerates are depends on the game I'm playing. For example, a fast-paced action game - a shooter like Call of Duty 5 - is much smoother and better if you can get that 60fps, although 50 is adequate. A less fast-paced game with fewer characters running around - i.e. an action RPG like Oblivion - is fine if you stay in the 40+ range. Some RTS games are very smooth at 30+ FPS, although ideally I like to stay at 45 or more FPS in all my games.

Generally speaking, the more important consideration when talking about "acceptable" FPS is the lowest FPS you hit. Oblivion is a notorious example - it has very unstable fps that can drop from 60+ down to 20 in some areas - thereby necessitating a more powerful machine, mostly to keep the dips from going too low. (there are some good Oblivion mods that help low to mid range systems, btw)

Anyways, enough talk about one of my fave games. I guess the point of my Saturday morning ramble is that I like to max my games out and maintain 45 FPS or higher. If I can get that much I'm satisfied; if not, I start thinking about new hardware. Now to finish my coffee and thereby hopefully make my next post more coherent.
Lord Sojar
The Fallen One
#11
Quote:
Originally Posted by tw1tchdp
i think more around 100 ^^
Quote:
Originally Posted by mathiastemplar
Some good articles on wikipedia about this:P
http://en.wikipedia.org/wiki/Framerate
http://en.wikipedia.org/wiki/Wagon-wheel_effect
But in general, anything above 24fps should be acceptable.
glhf =)
My post is based on the rules of optical physics and human biology, not ideals. It is fact by mathematical standards. Take it as you will, but math can't lie.
Quaker
Hell's Protector
#12
Quote:
Originally Posted by samerkablamer
hey all, this question isn't actually specifically GW related, but it's the Technician's Corner so i thought i could ask it anyway...
As far as I know, this forum is for general tech questions/problems posed by the users of the forum. The questions/problems do not need to be GW related.

Quote:
most gamers CARE about framerates being as high as possible, with many people not even being satisfied with 60.
I don't think this applies to "most gamers", but without doing a survey, it's hard to tell. (It doesn't apply to me.) It's more often the computer hobbyists who are into having the biggest, fastest computer, who care about max framerates - especially those who are gamers as well.
Also, for testing purposes, running fps as high as possible is a way to compare one system/component to another.

Quote:
why do these hardcore gamer enthusiasts seem to notice?
Some people who run FPS (First Person Shooter) type games (and GW PvP) swear that they can play and react better with high fps. Who knows? When I play TFC I don't notice any difference above 60 fps.

Quote:
secondly, isn't there a limit to how many fps a standard LCD monitor can deliver? some people talk about getting 110fps in many games, but can an LCD monitor with a refresh rate of 60 or 75Hz even display that many frames?
No, it can't. In a typical video card there is a thing called a "frame buffer". When the game (or app) puts together a "frame", it is stored in the frame buffer. Every 1/60th of a second (60 fps), the contents of the frame buffer are sent to the monitor. This is determined by the monitor's own hardware - LCD monitors can't display more frames than they are designed to. (There are, of course, some LCDs that run at 75Hz, and some HDTVs that can run at 120Hz.)
If the game is running at more than 60 fps, the frame buffer's contents are being updated more than 60 times a second, but the buffer is still only sent to the monitor at 60 fps. Depending upon the design of the frame buffer, this can mean that its contents change while the frame is being sent to the monitor. This can result in video artifacts such as "tearing", where part of the picture seems offset from the rest. But, in any case, the LCD itself is not being updated any faster than 60 times per second (or whatever its native vertical refresh rate is).
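
As a rough illustration (a toy Python model, not how the real hardware works - the 10-line "screen" and the frame tags are made up for the example), here's how overwriting the buffer mid-scanout produces a tear:

Code:
LINES = 10                          # pretend the screen is 10 scanlines tall
frame_buffer = [0] * LINES          # each line tagged with the frame that drew it

displayed = []
for line in range(LINES):           # the monitor scans out line by line
    displayed.append(frame_buffer[line])
    if line == 4:                   # the renderer finishes frame 1 mid-scanout...
        frame_buffer = [1] * LINES  # ...and overwrites the buffer (no vsync)
print(displayed)                    # [0, 0, 0, 0, 0, 1, 1, 1, 1, 1] <- the "tear"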

Basically, when gaming, there is no purpose in turning off vsync and running GW at more than 60fps. It just causes the video card to work harder, creating more heat, and makes the fan run faster/louder.
blackmage_z
Ascalonian Squire
#13
Quote:
Originally Posted by Quaker
Some people who run FPS (First Person Shooter) type games (and GW PvP) swear that they can play and react better with high fps. Who knows? When I play TFC I don't notice any difference above 60 fps.
I believe higher frame rates help reduce input latency, as the quicker the screen refreshes, the faster a change is displayed on screen. But we are talking milliseconds of difference. I suppose this would be useful to pro players, but for the standard user, meh.
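
For a rough sense of scale (a quick Python calculation; the FPS values are just examples), the worst case is that a change waits one full frame time before it can be drawn:

Code:
for fps in (30, 60, 120, 200):
    print(f"{fps:>4} FPS -> up to {1000 / fps:.1f} ms before a change is drawn")
# 30 FPS -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms, 200 -> 5.0 ms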

Here is an interesting article on controller latency for those interested:
http://www.eurogamer.net/articles/di...factor-article
Lord Sojar
The Fallen One
#14
Quote:
Originally Posted by Quaker
Basically, when gaming, there is no purpose in turning off vsync and running GW at more than 60fps. It just causes the video card to work harder, creating more heat, and makes the fan run faster/louder.

However, it was fun running a G300 card at 1,200+ FPS in GW @1920x1200 16xQ MSAA, 24xAF, and triple buffered. LOL. :P
moriz
Über těk-nĭsh'ən
#15
screenshot or it never happened

but yeah, vsync in GW tends to make the camera somewhat laggy. i generally play with it turned off to get a more responsive camera rotation.
Blackhearted
Krytan Explorer
#16
Quote:
Originally Posted by Rahja the Thief
However... anything above 60 FPS isn't noticeably different to the human eye, because of motion blur. Many modern games have added motion blur, making anything above 60 FPS unnecessary in most cases. 100+ FPS is fairly smooth to our eyes, yes, but without motion blur happening, it still looks "unnatural" to us.
Which is one of the worst things to happen to graphics in games. In most of those games that implement motion blur, it typically ends up being nothing but an overused, and oftentimes unnecessary, eyesore that needs to be disabled within 5-10 mins. Few games ever manage to get it right.


Quote:
Originally Posted by moriz
24fps is most certainly NOT acceptable while gaming. it is only acceptable in movies, with their extensive motion blurring, watched seated in darkness.
As Elder III said, that depends on the game. In sloooooowww paced games like GW and most other MMOs, 24-30 fps is quite usable.
moriz
Über těk-nĭsh'ən
#17
GW is as fast paced (or as slow paced) as you want it to be. with my playstyle, anything less than 60 fps WILL cut into my performance. it depends on the build, actually: ranger is affected greatly, while simpler builds like the typical assassin builds are not impacted at all.
Quaker
Hell's Protector
#18
Quote:
Originally Posted by blackmage_z
I believe higher frame rates help reduce input latency, as the quicker the screen refreshes, the faster a change is displayed on screen. But we are talking milliseconds of difference.
Another factor to take into account is how often the client (you) can update the server. For example, if your "ping" is 100ms, that's 0.1 seconds, or 1/10 of a second. That means that, at most, you could get the server to update your position no more than 10 times per second, and there must be some overhead in that (the time it takes to process the info), so your server update rate is even less than that.
As your ping goes up, your server-updates-per-second goes down. It may be fun to run GW at 200 fps, but it would have no effect on the game unless your ping is very, very low (sub 50ms) AND the server also runs at 200 "fps".
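
To put rough numbers on it (a quick Python sketch; the ping values are just examples, and it treats each update as a full round trip):

Code:
for ping_ms in (25, 50, 100, 200):
    print(f"ping {ping_ms:>3} ms -> at most {1000 // ping_ms} round-trip updates/sec")
# at 100ms ping you cap out around 10 updates/sec, no matter your frame rate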

99% of the apparent increase in performance above 60Hz can basically be attributed to the "placebo effect". (Maybe wearing a copper bracelet while you play can boost your speed AND reduce arthritis pain.)

As a side note: many years ago I ran a TFC (Team Fortress Classic) server. Same as now, many of the players swore they could play better and react faster with high fps rates. But, being the server admin, I could see the actual "fps" of the server. Depending upon the number of players, the server ran between 75 and 90 "fps" (not really frames-per-second, as there was no video on the server - more properly, program-loops-per-second). So it didn't matter what the players thought they were getting; the server would only update 90 times per second (max) anyway.
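
As a toy illustration (Python; the 90-tick figure matches the TFC server above, and the 300 FPS client is hypothetical), a fast client's frames simply collapse into however many ticks the server actually runs:

Code:
TICK_RATE = 90                       # server loops per second, as on the TFC server
client_frames_per_sec = 300          # hypothetical client rendering at 300 FPS

frames_per_tick = client_frames_per_sec / TICK_RATE
print(f"{frames_per_tick:.1f} client frames pass for every tick the server can act on")
# ~3.3 rendered frames per server update - the extra frames change nothing server-side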
It would be interesting to discover the actual "fps" rate of a typical GW server.
Rushin Roulette
Forge Runner
#19
Quote:
Originally Posted by moriz
GW is as fast paced (or as slow paced) as you want it to be. with my playstyle, anything less than 60 fps WILL cut into my performance. it depends on the build, actually: ranger is affected greatly, while simpler builds like the typical assassin builds are not impacted at all.
Believe me, GW (even high-level PvP) is very slow paced compared with most FPS games, and even in those games it doesn't really make much of a difference whether you are playing at 60, 75 or 120 FPS with the right monitor.

I play HL2: DM (I used to play and run tournaments and leagues as well), and GW is reeeealy slow comparatively. In GW, I've never seen 12+ players bouncing around like kangaroos on a sugar rush after receiving a double intravenous dose of speed.

At a certain level you are just at the maximum of your own reflex timing.

Most top-level FPS players don't actually need the FPS boost from lowering the graphics options below minimum; they just don't want to be distracted by random patterns in the background that might look like an opponent out of the corner of their eye.
Ouch
Ascalonian Squire
#20
Hah, I've had to put up with 9 FPS for the last 3 years, and all I do is PvP. It's hard.