Going beyond 60 FPS

Garion

Ascalonian Squire

Join Date: Jul 2005

Madrid, Spain

Los Resolutores (RES)

W/Me

Hi,

I use a Dell XPS Inspiron laptop with an NVIDIA GeForce Go 6800 Ultra card.

Whatever resolution I use, from 1024 to 1920, the game tops out at 60 FPS.

I can see that the card should be able to go beyond 60 FPS, but somehow it's stuck there. Do you know if 60 FPS is Guild Wars' maximum FPS? Is it a driver issue? If so, can you give me a link to update my GeForce Go 6800 Ultra drivers? Is there any command or parameter I should add to go beyond 60 FPS?

Thanks in advance!!!

neoteo

Banned

Join Date: Jun 2005

Macau

www.exilesofdarksteel.com

E/

May I ask where you see the FPS in Guild Wars?

Numa Pompilius

Grotto Attendant

Join Date: May 2005

At an Insit.. Intis... a house.

Live Forever Or Die Trying [GLHF]

W/Me

Quote:
Originally Posted by Garion
Hi,

I use a Dell XPS Inspiron laptop with an NVIDIA GeForce Go 6800 Ultra card.

Whatever resolution I use, from 1024 to 1920, the game tops out at 60 FPS.
You probably have vsync enabled.

What you do is open Control Panel/Display/Settings/Advanced/GeForce 6800 Ultra/Performance and quality settings.
There you'll find "Vertical sync" in the list; set it to 'Off'.

Hockster

Banned

Join Date: Jul 2005

Ditto on the v-sync advice.

To see stats you can modify the shortcut that starts GW. Right-click it, choose Properties, and in the Target box add this to the end: -perf. Make sure to have a space after the closing quotation mark and keep the hyphen in there. Start the game and a set of statistics will appear at the top.
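For anyone unsure what the modified Target line should end up looking like, here's an example (the install path shown is just an illustration; yours may differ):

```
"C:\Program Files\Guild Wars\Gw.exe" -perf
```

Note the space between the closing quotation mark and the hyphen.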

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

A 60 FPS max is a good thing for your laptop display. Faster frame rates actually wear out LCD pixels at a much faster rate. This is true for all current laptop and flat-panel LCD displays. This is also why DVI connectors top out at a 60 Hz vertical refresh rate.

Besides, flicker is not a problem for LCDs, since their pixels have constant luminance until the image changes, while CRT pixels rapidly fade after each strobe (hence bad flicker below 72 Hz or so).

Garion

Ascalonian Squire

Join Date: Jul 2005

Madrid, Spain

Los Resolutores (RES)

W/Me

Man, you rock!! Thanx for the fast advice. I'll check it out asap.

So I understand that the FPS cannot exceed the monitor's actual refresh rate, can it? So even though the XPS's GeForce 6800 card can deliver 100 FPS, if the laptop screen can't go over 60 Hz I won't be able to enjoy 100 FPS?

Yes, I use -perf to see the stats. As I understand it, it shows the number of polygons, or something like that, the FPS, and the data transfer rate between the computer and the game server.

It is simple, small and straightforward information. Since I discovered it I do not play without it.

Thanx again to yall!!

Hockster

Banned

Join Date: Jul 2005

Disabling v-sync will allow more than 60 FPS. V-sync merely locks the maximum FPS to the monitor's refresh rate.
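Here's a toy model of that cap (just an illustration I'm adding, not anything from the game's code; note that real double-buffered vsync can also drop the rate to an integer fraction of the refresh rate when the card renders slower than the panel refreshes):

```python
# Simplified model: with vsync on, a finished frame waits for the
# next screen refresh, so the displayed rate can never exceed the
# panel's refresh rate. With vsync off, frames are shown as soon
# as they are rendered (which is what causes tearing).

def displayed_fps(render_fps, refresh_hz, vsync):
    """Frame rate the player actually sees under this toy model."""
    if vsync:
        return min(render_fps, refresh_hz)
    return render_fps

# A card rendering 100 FPS on a 60 Hz laptop panel:
print(displayed_fps(100, 60, vsync=True))   # capped by the refresh rate
print(displayed_fps(100, 60, vsync=False))  # full rate, with possible tearing
```

That's why Garion's card "can go beyond 60 FPS but somehow is stuck there": the cap comes from the panel, not the GPU.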

Garion

Ascalonian Squire

Join Date: Jul 2005

Madrid, Spain

Los Resolutores (RES)

W/Me

And will that actually damage the LCD panel?

It's really not a problem, as I got this XPS as a gift from Dell, but I want to know if switching off vsync and letting the card push 90 FPS to the LCD panel will actually set it aflame or something lol

Willy Rockwell

Lion's Arch Merchant

Join Date: May 2005

What makes you think you would even see frame rates higher than 60? Television is 30 fps with interlaced fields, which technically is 60 fields per second. Movies are 24 fps. Why would you want a higher frame rate than television?

Hockster

Banned

Join Date: Jul 2005

Quote:
Originally Posted by Garion
And will that actually damage the LCD panel?

It's really not a problem, as I got this XPS as a gift from Dell, but I want to know if switching off vsync and letting the card push 90 FPS to the LCD panel will actually set it aflame or something lol
It will be fine.

Quote:
What makes you think you would even see frame rates higher than 60? Television is 30 fps with interlaced fields, which technically is 60 fields per second. Movies are 24 fps. Why would you want a higher frame rate than television?
That's just plain wrong.
http://www.uca.edu/org/ccsmi/ccsmi/c...0Revisited.htm

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by Hockster
Quote:
Originally Posted by Garion
And will that actually damage the LCD panel?

It's really not a problem, as I got this XPS as a gift from Dell, but I want to know if switching off vsync and letting the card push 90 FPS to the LCD panel will actually set it aflame or something lol
It will be fine.

...<SNIP>...
It won't immediately damage the LCD display, but it will wear the LCD pixels out much faster, since they have a finite amount of on/off switchability. You'll notice reduced color vibrancy much sooner than normal.

There's not much to be gained by running GW at a >60 Hz refresh rate. You'll usually see more video tearing with V-Sync disabled.

Hockster

Banned

Join Date: Jul 2005

I've yet to see any proof anywhere that it causes premature failures.

Xue Yi Liang

Jungle Guide

Join Date: May 2005

Northern CA

Outlaws of the Water Margin

Mo/Me

Quote:
Originally Posted by Willy Rockwell
What makes you think you would even see frame rates higher than 60? Television is 30 fps with interlaced fields, which technically is 60 fields per second. Movies are 24 fps. Why would you want a higher frame rate than television?
You can actually perceive even higher than 60 frames per second - as I recall, the upper limit was anywhere from 90 fps (don't quote me on that figure) to 200 fps (noted in tests with military fighter pilots). With higher framerates there is more information sent to the brain, translating into a heightened sense of "reality."

In fact, a process called Showscan has been promoted by special-effects man Douglas Trumbull since the late '70s. Showscan was a new film format projecting 70mm film at 60 fps. The resulting image is supposedly startlingly lifelike when shown to an audience - even more convincing than 3D or anything else witnessed on a big screen.

Taken from "24p: Back to the Future?", an article in the October 2003 issue of Digital Video:

"When Douglas Trumbull developed Showscan (70 mm at 60 fps) in 1976, he noted a profound psychological reaction among his test audiences when the frame rate hit 60 fps: The film ceased to be a film and was more like a window into reality. It just wasn't any good for storytelling, Trumbull claimed. Showscan was thus relegated to theme park immersive-entertainment venues, and a grand experiment in theatrical storytelling frame rates was shunted aside. (Of course, the scandalous print costs for 60 fps 70 mm could be part of the reason for the format's limited adoption.)....

...All analog television standards today employ interlaced rendering with a field rate based on the local power line frequency (or at least historically derived therefrom), and the power line frequency lurks in most DTV standards and production formats as well. NTSC broadcasts run with an approximate field rate of 60 Hz; the two most common HDTV formats in North American broadcasting are 1080/60i (interlaced) and 720/60p (progressive). All share that 60 Hz motion update rate that Trumbull found to be the dividing line between 'film' and 'reality.' ....Yet today we've learned to decouple production rates from display rates: 24 fps film appears on telly using 3:2 pulldown; high-end PAL sets often double-scan each frame for reduced flicker; frame stores let cameras capture 24p imagery and record it as 60i"


The reason it never took off was the expense and impracticality of filming and projecting 70mm film at 60 frames per second - which demanded tons of film and extremely powerful projection bulbs, for instance. Recently his company filed for bankruptcy, which is a pity. I wish I had a chance to experience it just once.

here's his website: http://www.showscan.com/company_1.htm

----------
So what's the big deal about 60 fps in film if that can already be observed in PC games? I suppose gaming at 60 fps or more isn't the same as film, since you're dealing with the limitations of computer graphics, which may or may not apply a convincing motion blur to each frame. The graphics quality, in this case, would be your limiting factor for a more convincingly "real" experience. Anyway, that's how I understand it.

Numa Pompilius

Grotto Attendant

Join Date: May 2005

At an Insit.. Intis... a house.

Live Forever Or Die Trying [GLHF]

W/Me

And surely everyone can see that movies stutter?
Now that I'm used to higher framerates, going to theatres annoys the heck out of me, because every time the camera pans or moves I can see the stutter.

Personally I want higher than 40 frames per second at all times. If I get that, I'm good.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by Hockster
I've yet to see any proof anywhere that it causes premature failures.
Not premature failure, but premature pixel wear/fade. LCD TFTs are, in essence, electronic switches with mechanical parts... read about it here:

http://graphics.tomshardware.com/dis...624/index.html

Anything with moving parts, even at the scale TFT polarized filters are made, will eventually wear out. Most LCDs have a 50,000-hour MTBF at 60 Hz.

Sure, users can go with higher refresh rates, but what's the point when LCD TFTs don't refresh? CRTs refresh, LCDs don't (which is also why they don't flicker). Their pixels stay on and don't fade until switched off, unlike their CRT counterparts, whose pixels start fading after each refresh strobe. This is why DVI signal specs are capped at 60 Hz for conventional flat-panel displays, since there is no visible benefit from running higher vertical refresh rates.

I did so much homework on this stuff as an FPS gamer, my head still hurts...

stone433

Ascalonian Squire

Join Date: Nov 2005

E/N

Is there any other way you can modify GW other than -perf?

Hockster

Banned

Join Date: Jul 2005

Not saying you're wrong, but I've learned to take anything coming from Tom's Hardware with four or five grains of salt, especially an article that's six years old.
His methods were not always accurate, or even valid at times.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by Hockster
Not saying you're wrong, but I've learned to take anything coming from Tom's Hardware with four or five grains of salt, especially an article that's six years old.
His methods were not always accurate, or even valid at times.
The TFT polarized-filter switching hasn't changed much (just performance and yield rates), and none of the more recent articles explain the mechanics of TFTs at the pixel level.

Also, THG didn't draw those TFT diagrams solo -- their sources were Toshiba, Samsung, Sony, and other TFT manufacturers who actually build these displays.

The next big display technology is Lumileds, but those still have latency bugs to work out.

pkodyssey

Wilds Pathfinder

Join Date: Nov 2005

In a cardboard box with Internet

The Order of the Frozen Tundra (TofT)

N/

I heard the -heapsize setting helped, but I have seen no improvement with it on. I have 3 GB of PC3200 on the board, and setting -heapsize 1536000 has done nothing to improve performance. I got rid of it and am now using -perf for the heck of it. Anybody have insight into -heapsize?

FazeDx

Ascalonian Squire

Join Date: Oct 2005

W/Mo

60 FPS is enough. If you really want over 100 FPS or whatnot, you can overclock your video card, and I can tell you how, but I DON'T recommend it. 60 FPS is fine. You can also turn your desktop's refresh rate higher and it might affect the game. Just try; I don't know.

Seth3 Errow

Ascalonian Squire

Join Date: May 2005

Melbourne, Australia

W/Mo

-heapsize allows more RAM to be used, so loading textures and areas should be quicker; it's not meant to boost FPS.
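Putting the thread's flags together, an example Target line with -heapsize added would look like this (the install path is just an illustration, and the value is the one pkodyssey quoted):

```
"C:\Program Files\Guild Wars\Gw.exe" -perf -heapsize 1536000
```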

EternalTempest

Furnace Stoker

Join Date: Jun 2005

United States

Dark Side Ofthe Moon [DSM]

E/

Make sure you have the exact driver for your monitor installed in XP. If you're running off a "generic" one, your max refresh rate is determined by that driver.

Most users don't bother to update the monitor driver and just run off the "generic" one.

If you are running the generic driver and install the correct one, you may get higher refresh rate options within the specs of your monitor.

lord_shar

Furnace Stoker

Join Date: Jul 2005

near SF, CA

Quote:
Originally Posted by EternalTempest
Make sure you have the exact driver for your monitor installed in XP. If you're running off a "generic" one, your max refresh rate is determined by that driver.

Most users don't bother to update the monitor driver and just run off the "generic" one.

If you are running the generic driver and install the correct one, you may get higher refresh rate options within the specs of your monitor.
Also, if you have a flat panel and connect via DVI, the driver will usually cap the refresh rate at 60 Hz (for the reasons already mentioned earlier in this thread).