Originally posted by: bronzeshield
Some really weird notions in this thread. The current state of the art of NES emulation is better than that of just about any other console except maybe the SNES (thanks to byuu), so I'm not sure why it's being used as an example of something that's subpar -- even top-shelf emulators for systems like the Atari 2600 and Sega Genesis still have weird compatibility issues and inaccuracies.

And 200ms is an insane amount of lag that would dramatically affect even a casual gamer playing twitch games like Punch-Out, VCS Kaboom, any number of shmups, etc. I certainly notice that much lag playing Kaboom, or any Atari game, on my sister's midrange LCD. High-end studio musicians are affected by as little as 20ms of lag or less -- Jeff Porcaro could allegedly detect 5ms. It's not as simple as single-event reaction time; it also has to do with patterned behavior, the whole idea of "getting into a rhythm", and a bunch of other factors.

It still seems to me that there's an easy test for assessing total system lag. Build a ROM that does nothing but two things on controller input: turn the screen white and emit a beep. (Have it check on every frame, or even every scanline if possible.) Then take two contact mics and a video camera. Tape one mic to the controller, near the fire button, and the other to the speaker you're using. Start recording, boot the ROM, hit the fire button on the controller, and measure the distance between the attack transient (sound) of the button being pressed and the beep coming from the speaker. That's your audio lag, which may or may not be the same as the video lag (it should be, but might not be). Finally, study the video feed frame-by-frame to discern the delay between the screen flash and the beep.

Wouldn't that give a reliable absolute number? Then you could toggle every manner of variable to see how it changed things: LCD vs. CRT, emulation vs. real hardware, hard-wired vs. wireless controllers, etc.
I said 200ms of "total latency", not input lag. 200ms of input lag WOULD be insane.
"Testing has found that overall "input lag" (from controller input to display response) times of approximately 200 ms are distracting to the user" (http://en.wikipedia.org/wiki/Inpu...)
Please, if you can, point me to an NES emulator that can achieve the following:
-full screen mode, with black bars on the side but not the bottom
-correct aspect ratio (1.143 for NES)
-no screen tearing
-no (or very little) input lag
-very close to perfect accuracy/timing
-controller support
-no (or very few) sound issues
Like I said, I've messed around with half a dozen NES emulators and was unable to achieve all of the above with any of them. Nestopia would be awesome if vsync didn't result in horrible input lag, so you either have to accept horrible screen tearing or horrible input lag. All the emulators I've tried had similar issues. With FCEUX I could achieve full screen with no screen tearing and very little input lag, but the accuracy of that emulator seemed really poor and/or it just didn't run well on my machine.
You also perfectly described Blargg's NES reflex timer. Using it I get a total average time of less than 240 ms in RetroArch, where the reference is 200 ms using a real NES connected to a CRT. So that's less than 40 ms of added lag.
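For the mic-based test described in the quote, the offline analysis step boils down to finding the first transient in each channel and converting the sample offset to milliseconds. Here's a minimal sketch in Python, assuming the two contact mics are captured as separate channels of one recording; the helper names (`find_onset`, `measure_lag_ms`) and the fixed threshold are my own, not from any existing tool:

```python
def find_onset(samples, threshold=0.1):
    """Index of the first sample whose magnitude exceeds threshold.

    Good enough for a clean recording with a sharp attack transient;
    a real setup would probably want a noise floor estimate instead
    of a hard-coded threshold.
    """
    for i, s in enumerate(samples):
        if abs(s) > threshold:
            return i
    return None

def measure_lag_ms(button_ch, speaker_ch, sample_rate=48000):
    """Milliseconds between the button transient and the beep."""
    b = find_onset(button_ch)
    s = find_onset(speaker_ch)
    if b is None or s is None:
        raise ValueError("no transient found in one of the channels")
    return (s - b) * 1000.0 / sample_rate

# Synthetic demo: button click at sample 1000, beep at sample 3400,
# i.e. 2400 samples later, which is 50 ms at 48 kHz.
button = [0.0] * 48000
speaker = [0.0] * 48000
button[1000] = 0.9
speaker[3400] = 0.8
print(measure_lag_ms(button, speaker))  # -> 50.0
```

At 48 kHz each sample is about 0.02 ms, so even a crude threshold detector resolves lag far more finely than the one-frame (~16.7 ms at 60 Hz) granularity you get from counting video frames.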
Don't shooting fanatics love the XRGB? And doesn't it add 2 frames of lag (about 33 ms at 60 Hz) on top of whatever your display (minus deinterlacing) has? I just can't imagine that a few ms of lag could possibly be an issue. All the shooting-game and fighting-game fanatics don't seem to mind.
PatrickM.