I'm confused about how QueryPerformanceCounter works. I've used QueryPerformanceFrequency to get the tick rate, and I know that works because it correctly returns my CPU's speed in Hz. Searching Google turns up code like this:
Code:
LARGE_INTEGER lpPerformanceCount;

QueryPerformanceCounter(&lpPerformanceCount);
// (run a frame)
QueryPerformanceCounter(&lpPerformanceCount);
// (run a frame)
QueryPerformanceCounter(&lpPerformanceCount);
Now apparently you're supposed to save the "before" value and compare it to the "after" one. However, when I do the following:
Code:
QueryPerformanceCounter(&lpPerformanceCount);
// (message box displaying lpPerformanceCount)
QueryPerformanceCounter(&lpPerformanceCount);
// (message box displaying lpPerformanceCount)
QueryPerformanceCounter(&lpPerformanceCount);
// (message box displaying lpPerformanceCount)
The values differ by something like 10,000,000 when they should only be a couple of thousand apart. So what does QueryPerformanceCounter actually return? It always seems to wrap around at the 32-bit 4-billion mark.
Any help/examples?