Turning on V-Sync

steevo301
Academy Page
#1
Hey, I was just wondering if anyone could tell me how to turn on V-Sync to make it work on GW. I know how to get to the NVIDIA Control Panel, and I'm pretty sure I know where to go to do it. (If not, please correct me)

My problem is that under the "Global Settings" tab, there's a drop-down list called "Global Presets". Which option in this long list should I pick before turning on V-Sync?

Thanks
zamial
Site Contributor
#2
In Guild Wars: Options > Graphics > V-Sync is a check box. It's near the bottom.
Kaida the Heartless
Desert Nomad
#3
You don't need to do anything with your video card. You only need to check the above setting in Guild Wars to make it work.

All V-Sync does is make sure that the game doesn't produce frames any faster than your monitor's current refresh rate.
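That cap can be sketched as a simple frame limiter (a hypothetical Python illustration, not how the driver actually does it; real V-Sync waits on the monitor's vertical blank signal rather than a timer):

```python
import time

REFRESH_HZ = 60                       # assume a 60 Hz monitor
FRAME_BUDGET = 1.0 / REFRESH_HZ       # ~16.7 ms allowed per frame

def run_capped(num_frames, render):
    """Render frames, but never finish one faster than the refresh rate allows."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render()                      # draw the frame; GW-era scenes finish fast
        spare = FRAME_BUDGET - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)         # idle away the rest of the frame budget
```

With a render step that finishes in a millisecond, the loop still takes at least 16.7 ms per frame, so the game never outpaces the monitor.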
steevo301
Academy Page
#4
Oh, well that's easy enough.

Thanks for the help!
The Muffen Man
Krytan Explorer
#5
Talking about V-Sync, is there any need to use it? Guild Wars runs at around 300 fps with everything at max.

Will that kind of fps damage my monitor?
moriz
Über ték-nísh'ün
#6
None. If the fps goes over the refresh rate, the additional frames simply won't be displayed.

However, if your fps greatly exceeds your refresh rate, you will sometimes get "screen tearing", which means the top and bottom parts of your screen desync and you get broken images. That's the main reason why we use vertical sync.
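That desync can be shown with a toy model (hypothetical Python, names mine): the monitor scans out rows top to bottom, and if the GPU delivers a new frame mid-scan, the top half of the screen comes from the old frame and the bottom half from the new one.

```python
ROWS = 10                            # a tiny 10-row "screen"

def scan_out(old, new, swap_at_row=None):
    """Read the screen top to bottom; optionally swap frames mid-scan (no V-Sync)."""
    frame, shown = old, []
    for row in range(ROWS):
        if row == swap_at_row:
            frame = new              # GPU finished a new frame during scanout
        shown.append(frame[row])
    return shown

frame_a = ["A"] * ROWS               # the old frame
frame_b = ["B"] * ROWS               # the new frame

torn  = scan_out(frame_a, frame_b, swap_at_row=5)  # tear: A on top, B below
clean = scan_out(frame_a, frame_b)                 # V-Sync holds the swap: all A
```

The `torn` result mixes rows from two different frames on one screen, which is exactly the broken image moriz describes.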
Numa Pompilius
Grotto Attendant
#7
nVidia cards do not get tearing, so vsync should always be left off.
Kaida the Heartless
Desert Nomad
#8
If you are running an LCD monitor, it is suggested you turn this option on. Here's why (if you're interested):

LCDs do not technically have refresh rates. Instead, each pixel works on a "light up this way until I tell you to change" basis. So, technically, if you're staring at something that is entirely the same color (let's say pure red, for example), your monitor doesn't have to update at all. Unfortunately, this is not the case in practice. Because of Windows, the monitor must act like any other monitor: the pixels relight themselves as often as your framerate demands.

Basically, if you're pushing a lot of frames without V-Sync on, your monitor is doing a lot of work it's not supposed to, i.e. it's going to die sooner. The choice basically comes down to:
Do I want to use my monitor for a longer time with "worse" fps, or a shorter time with "better" fps?
Brianna
Insane & Inhumane
#9
Quote:
Originally Posted by Numa Pompilius
nVidia cards do not get tearing, so vsync should always be left off.
Lol what?

Nvidia cards do too get tearing, both my 8800GTS and 8600GT in two different computers get tearing, Vsync is necessary if your card is pushing tons of frames.
The Muffen Man
Krytan Explorer
#10
Quote:
Originally Posted by Brianna
Lol what?

Nvidia cards do too get tearing, both my 8800GTS and 8600GT in two different computers get tearing, Vsync is necessary if your card is pushing tons of frames.

So what's tons? So if I'm getting over 60 fps, should I use V-Sync?
Crimson Flame
Better Than Arkantos
#11
Quote:
Originally Posted by The Muffen Man
So what's tons? So if I'm getting over 60 fps, should I use V-Sync?
Simple answer to this is yes. You're not going to get any discernible benefit from showing 300 FPS, since the human eye can't see changes that fast, and you run the risk of messing up your monitor.
Quaker
Hell's Protector
#12
60 fps is more than adequate, and beyond what the human eye needs in order to perceive smooth motion (motion pictures are only 24 fps).

Having more than 60 fps (or whatever the Vertical Refresh (VR) rate is) will have no noticeable effect upon gameplay.

Your monitor will only be displaying at ITS VR rate regardless of the fps of the game. The video card, or Graphics Processing Unit (GPU), has a "frame buffer", whose output is sent to the monitor at the VR rate. If the GPU is running at more than the VR, parts of this frame buffer's contents may change before or while they are being sent to the monitor. This results in 'tearing' or other graphic artifacts. (The screen still doesn't update at more than the VR rate, though.)

However, running the GPU at more than the VR rate will cause it to work harder, consuming more power and generating more heat.
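Quaker's frame-buffer description boils down to: with V-Sync, the buffer swap waits for the next vertical blank; without it, the GPU runs flat out. A hypothetical sketch of the numbers (my own toy simulation, using exact fractions so the arithmetic is deterministic; a 3 ms render time is an assumption, not a measurement):

```python
from fractions import Fraction

REFRESH_HZ = 60
VBLANK_INTERVAL = Fraction(1, REFRESH_HZ)    # exactly 1/60 s between vertical blanks

def simulate_fps(render_time, vsync, duration=Fraction(1)):
    """Count how many frames finish in `duration` seconds of simulated time."""
    t, frames = Fraction(0), 0
    while t < duration:
        t += render_time                     # GPU fills the back buffer
        if vsync:
            # the buffer swap is held until the next vertical blank
            t = (t // VBLANK_INTERVAL + 1) * VBLANK_INTERVAL
        frames += 1
    return frames

fast_render = Fraction(3, 1000)              # a 3 ms frame: a strong card on GW
simulate_fps(fast_render, vsync=False)       # ~334 frames/s: GPU runs flat out
simulate_fps(fast_render, vsync=True)        # 60 frames/s: clamped to the VR rate
```

Same game, same card: V-Sync just parks the GPU between vertical blanks instead of letting it render frames the monitor will never show.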

P.S. A case can be made, in first-person shooters (think Counter-Strike), for frame rates above 60, but that has more to do with ping and response time, and even then 100 fps is more than enough even for those who say it matters. At any rate, that doesn't apply to GW's combat system as much, if at all.