Originally posted by: bunnyboy
That was my original plan, but I couldn't find a light sensor that would work. Did you ever get to that point? I wonder how much the brightness matters for sensors like the Bodnar. I am ending the test when I see the first sign of the new graphics, which still might be too dark for it to register. That could mean a 5 ms difference between my camera results and an optical sensor.
My plan is more lo-tech. An LDR (light-dependent resistor) in a bridge configuration, connected to a comparator. Then the comparator threshold is set to a suitable level. This would have to be decided manually by looking at the LDR response on an oscilloscope.
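Before setting the threshold in hardware, the comparator's job can be sketched in software: sample the LDR and report the time of the first sample that crosses the chosen level. The sample rate and threshold below are made-up numbers for illustration, not measured values.

```python
# Sketch: find the first sample where brightness crosses a threshold,
# mimicking what the comparator does in hardware.
# Sample rate and threshold are illustrative assumptions.

def first_crossing_ms(samples, threshold, sample_rate_hz):
    """Return the time (ms) of the first sample at or above threshold,
    or None if the threshold is never reached."""
    for i, level in enumerate(samples):
        if level >= threshold:
            return 1000.0 * i / sample_rate_hz
    return None

# 10 kHz sampling gives 0.1 ms timing resolution
samples = [0.02, 0.03, 0.02, 0.15, 0.40, 0.80, 0.95]
print(first_crossing_ms(samples, threshold=0.5, sample_rate_hz=10_000))  # 0.5
```

Picking the threshold from an oscilloscope trace, as described above, then just maps to choosing `threshold` here.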
As for your camera, here's an idea: flip the camera 90 degrees and see if you're getting a diagonal transition pattern in each frame. Assuming the camera's frame rate is well above the screen's (which would be required for this to be useful at all) and that it has a rolling shutter, this means that for every line the camera records, the screen will have drawn slightly more of the new frame. If you don't rotate the camera, the screen and camera will just meet at some scanline, and you will simply get a flat tear line per frame. In other words, rotating the camera gives you more information that you can parse. Again, this assumes a high-speed camera and a rolling shutter.
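A quick back-of-the-envelope sketch of why the rotation helps: with a rolling shutter, each camera row is exposed slightly later than the previous one, so the effective temporal resolution drops from one camera frame to one camera row. The frame rate and row count below are assumptions for illustration, not properties of any particular camera.

```python
# Sketch: rolling-shutter timing resolution.
# 240 fps and 720 rows are assumed numbers, not measured ones.

camera_fps = 240
camera_rows = 720

frame_time_ms = 1000.0 / camera_fps          # time between camera frames
row_time_ms = frame_time_ms / camera_rows    # time between adjacent rows

print(f"per-frame resolution: {frame_time_ms:.3f} ms")  # 4.167 ms
print(f"per-row resolution:   {row_time_ms:.6f} ms")    # ~0.0058 ms
```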
Mine is already starting at the poll, but another LED on the button itself was the next step. I think that is only significant for emulators where the hardware poll does not align with the NES game poll. It would be awesome to measure a tetris player to see how much time they have between button press and button poll!
'Xactly. This would mainly be interesting for emulators, where the processes are asynchronous. On a real NES with an ideal CRT everything is synchronous (aligned), and the controller lag (depending on when in a frame you happen to push the button) is predictable to 0-16.7 ms. Likewise, the screen draw takes 16.7 ms, which gives a total worst-case scenario of 33.4 ms (you just barely missed the poll, and you're looking for a change at the bottom of the screen) and a total best-case scenario of nearly 0 ms (you exactly hit the poll, and you're also looking for a change near the top of the screen). This also assumes that the game processes the input data in the same frame as it was polled.
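The bounds above can be written out as a quick arithmetic sketch (using the round 16.7 ms figure for a 60 Hz frame):

```python
# Worked numbers for the NES + CRT case described above.

frame_ms = 1000.0 / 60.0   # one frame at 60 Hz, ~16.67 ms

# Worst case: press just after the poll (wait ~1 frame for the next poll)
# and watch for a change at the bottom of the screen (~1 frame of scanout).
worst_ms = frame_ms + frame_ms

# Best case: press right at the poll and watch a change near the top,
# so both terms approach zero.
best_ms = 0.0

print(f"worst case: {worst_ms:.1f} ms, best case: ~{best_ms:.0f} ms")  # 33.3 ms, ~0 ms
```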
The less interesting scenario, in the case of an emulator or digital TV setup, is where every step introduces latency, so that the worst case is always worse than a CRT. The more interesting case is a theoretical emulator setup where all parts are specially made as an integrated system, and the screen redraw approaches 0 ms (say something like an OLED screen where you can push data extremely quickly to the screen, and also make the pixels react quickly), and where you have a deadline-type emulation which waits until the absolute end of the frame before processing the input. So, you would still process events at 60 Hz, but all processing would be done in one burst. The input would be read at the last possible moment, then processed (one frame is emulated and rendered) and then blitted in a time approaching 0 ms. With this theoretical setup, you will actually beat the NES/CRT setup, because the absolute worst-case input-output time is now approaching 16.7 ms. All of this is arguably beyond the human threshold of reaction, but it's an interesting theoretical discussion.
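A rough sketch of that deadline scheme's timing, with assumed (not measured) budgets for emulation and blitting; as both budgets approach zero, the worst case approaches the single 16.7 ms frame described above:

```python
# Sketch: deadline-type emulation. Input is sampled at the last possible
# moment, one frame is emulated in a burst, then pushed to a near-instant
# display. emulate_ms and blit_ms are assumed budgets for illustration.

frame_ms = 1000.0 / 60.0

emulate_ms = 2.0   # assumed time to emulate + render one frame
blit_ms = 0.1      # assumed time to push the frame to a fast panel

# Input must be read this long before the frame deadline:
read_offset_ms = emulate_ms + blit_ms

# Worst case: you press just after the input read, so you wait almost a
# whole frame for the next read, plus the processing itself.
worst_ms = frame_ms + read_offset_ms

print(f"worst-case input-to-photon: {worst_ms:.1f} ms")  # 18.8 ms
```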
I am already picking a consistent point on screen, so that shouldn't make a difference other than a constant add or subtract from the final number. The LCD/plasma panel won't be able to draw any sooner simply because the NES/emulator won't have output the video data yet.
I think displaylag.com had more info about how they average top/bottom/middle to account for different drawing routines, but I'm not sure that is significant enough to care about.
Well yeah, this is speculative on my part, and would only affect digital monitors (whether driven by HDMI or composite). An ideal CRT already offers the lowest possible latency and of course can't draw data before it gets it. But a digital LCD TV could, for example, do something weird like buffer the stream into a framebuffer and then draw it as fast as the panel can be driven, say 8 ms for the sake of argument. The point here is more one of curiosity and creating a rig that is as generic as possible, so it would be able to detect this property in a monitor. If I were to choose just one point on the screen, it would be near the bottom.
I don't think the LPT port is low-latency enough; there are just too many layers of software and hardware in the way. Not sure of anything else that would be faster without some extreme hardware, though. Are there any USB 3.0 keyboards to use the Num Lock light?
I would have to speculatively disagree. There's no reason why a USB 3.0 keyboard would do any better, assuming that the status LEDs are just updated at the time the keyboard is polled. Windows polls the keyboard at a rate of 125 Hz. I'm guessing this value was chosen to guarantee two polls per frame at 60 Hz, while keeping power consumption down by allowing the CPU to potentially halt completely for as long as possible when idle, assuming nothing else interrupts the CPU. By increasing the poll rate to 1000 Hz (the maximum possible per the USB standard, at least USB 1 and USB 2; I haven't read the USB 3 spec), you should be able to get the latency of this event down to approaching 1 ms.
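The polling numbers work out like this: an event arriving at a random time waits between zero and one full poll interval before the host sees it, so the average added latency is half an interval.

```python
# Added latency from host-side polling at a given rate.

def poll_latency_ms(rate_hz):
    """Return (average, worst-case) latency in ms added by polling."""
    interval = 1000.0 / rate_hz
    return interval / 2.0, interval

print(poll_latency_ms(125))    # (4.0, 8.0)  -- the Windows default rate
print(poll_latency_ms(1000))   # (0.5, 1.0)  -- 1 ms USB frames
```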
But I don't see why LPT port communication would have to be slow, if you make a special driver which allows direct I/O port access. Yes, the call would have to bubble down to ring 0, but this is done for all hardware calls, and with a proper driver this should be a fast, lightweight operation. (Never mind the added hassle of making such a driver, implementing support for it in emulators, and bypassing driver signature enforcement.) The only real issue I can see is if the LPC bus (which the LPT port probably sits on in any modern computer) somehow adds lag. But that could be measured and remains to be seen.