How can I avoid waiting for vblank?
Windows' SwapBuffers() call seems to automatically wait for a vertical blank before flipping. Is there any way to force it to flip immediately?
With vsync on, the maximum framerate I can get is 60fps, which is no use at all when I'm trying to see performance differences. I could just disable the swap for a few seconds, I suppose, and store the framerate during that period...
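For reference, the usual route is the WGL_EXT_swap_control extension: wglSwapIntervalEXT(0) asks the driver to swap without waiting for vblank. A minimal sketch follows; it assumes an OpenGL context is already current, and note that the driver control panel can still force vsync on regardless of what the application requests.

```cpp
// Minimal sketch: disable vsync via WGL_EXT_swap_control, assuming a current
// OpenGL context. Returns false if the driver does not expose the extension.
#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

bool DisableVSync()
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (!wglSwapIntervalEXT)
        return false;                       // extension not available

    return wglSwapIntervalEXT(0) == TRUE;   // 0 = swap immediately, no vblank wait
}
```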
See the messageboards and your driver controls. And why the heck only 60Hz? You should be running at least 75Hz.
quote:
Original post by a person
See the messageboards and your driver controls. And why the heck only 60Hz? You should be running at least 75Hz.
If he's using Win2K (maybe XP too?), there's some issue with NVidia cards that caps it at 60 Hz (it may have been fixed; I haven't kept up with NVidia's Windows driver releases for a while, so I may be out of date).


For info on the monitor refresh limitations, see http://xp-refresh.net/ (it has a fix for NVidia cards; since this is an XP issue rather than a driver issue, there are fixes for other cards as well, but some are not free).
Basically the problem is that MS got silly: they capped D3D at 75Hz (makes no sense) and OpenGL at 60Hz (again, makes little sense), while desktops default to 75Hz (I think). The idea was apparently that since monitors may report their usable refresh rates incorrectly in some cases, it was safer to neuter things. Naturally, if you're going to neuter part of the video subsystem, you might as well cut back the competing graphics API with an even lower refresh rate.
Personally I run my monitor at 120Hz, since I can't stand anything lower than 85Hz (the flicker is just too much to bear).
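As a hedged sketch (not the xp-refresh.net fix itself), an application can at least request a specific refresh rate when switching to fullscreen through ChangeDisplaySettings() and the dmDisplayFrequency field of DEVMODE. On the affected XP setups the OS may still clamp the rate you actually get, so the return value has to be checked; the mode values below are illustrative.

```cpp
// Minimal sketch: request a fullscreen mode with an explicit refresh rate.
// Returns true only if Windows accepted the exact mode that was asked for.
#include <windows.h>

bool SetFullscreenMode(int width, int height, int bpp, int hz)
{
    DEVMODE dm = {};
    dm.dmSize             = sizeof(dm);
    dm.dmPelsWidth        = width;
    dm.dmPelsHeight       = height;
    dm.dmBitsPerPel       = bpp;
    dm.dmDisplayFrequency = hz;     // e.g. 85 or 120 instead of the 60Hz default
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL | DM_DISPLAYFREQUENCY;

    return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}
```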