I have implemented dynamic resolution rendering in my engine, which requires measuring how long each frame takes to render.
When vsync is disabled (i.e. sync interval == 0), I can just measure the time between successive calls to Present() using CPU timers, and everything works fine.
However, when vsync is enabled (sync interval == 1), this no longer works: Present() blocks until the next vertical blank, so I always measure ~16 ms (the refresh interval of a 60 Hz monitor) regardless of the actual rendering cost.
Are there any tricks for accurately measuring the time the GPU actually spends executing the draw commands when vsync is enabled, so I can drive my dynamic resolution scale?