NintendoAge

"Console-perfect" NES emulation It's 2014. Why is this so hard?

Apr 29, 2015 at 1:13:01 AM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: PatrickM.

I like the way the original Wii crops/stretches NES games (not sure if it's universal or just NES games). They crop 4 lines off the bottom and 12 off the top, then stretch to fill 4:3.

Wii U VC is doing something horrible. It also crops the bottom 4 and top 12 but then instead of stretching the remaining image to fill the 4:3 area, they trim the 4:3 area down to a slightly smaller 4:3 area so that when the image is stretched it retains the same pixel dimensions as an unaltered 4:3 image with all the overscan displayed. To me this just looks really wrong. If there's overscan missing at the top and bottom then the image needs to be vertically stretched accordingly. Not only is it wrong because no CRT could ever look like that, it results in awful letterboxing. Why did they have to implement this horrible cropping method? And why is the color palette so dark? So far I like VC on the Wii a lot better than the Wii U.

Also, why can't Nintendo implement a simple scanline option for those that prefer them?

Retroarch 1.0.0.3 will have overscan options so that it will be easy to crop the amount you want and resize it to 4:3. For now, I figured out I can emulate the same pixel dimensions as in old Wii VC by setting the video aspect ratio in RA to 1.2 (6/5) and leaving overscan on.
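
For anyone who wants to check the arithmetic in the quoted post, here's a minimal sketch in Python. The 256x240 frame size and the 12/4-line crop come from the post; the variable names and framing are illustrative, not anything from RetroArch or the VC.

```python
# Sketch of the Wii VC crop described above: drop 12 lines from the top and
# 4 from the bottom of the 256x240 NES frame, then stretch the rest to 4:3.

NES_W, NES_H = 256, 240
CROP_TOP, CROP_BOTTOM = 12, 4

visible_h = NES_H - CROP_TOP - CROP_BOTTOM   # 224 lines survive the crop
print(f"cropped image: {NES_W}x{visible_h}")

# To reproduce the same pixel shape in an emulator that still shows all 240
# lines (overscan on), shrink the display aspect ratio in the same proportion:
mimic_aspect = (4 / 3) * visible_h / NES_H
print(f"aspect ratio matching the Wii VC pixel shape: {mimic_aspect:.4f}")
```

Taking the crop figures at face value, that works out to roughly 1.244, so the 1.2 (6/5) setting mentioned above is a close approximation rather than an exact match.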


Virtual Console (Wii-U) is terrible. Far, far from accurate. It has:

(1) terrible scaling,
(2) the wrong aspect ratio,
(3) no raster filter (i.e., no "scanlines"),
(4) terrible lag, even worse than the lag found on the Wii VC, and
(5) wrong sound effects and/or visual glitches from time to time.

Glad I abandoned the emulator ship long ago. Playing the real thing on a CRT is glorious, for more reasons than just the ones I listed.

-------------------------
What happens when we die?

www.truthaboutdeath.com

Apr 29, 2015 at 8:16:38 AM
Ozzy_98 (8)
< Bowser >
Posts: 6369 - Joined: 12/11/2013
Ohio
Why do people say "wrong aspect ratio" for emulators? That's impossible; they're doing 1:1 pixel mapping. And the NES doesn't produce scanlines; your TV does. If the "aspect ratio" is wrong, then pump the emulator out to a CRT and stop blaming the emulator for stuff it's not supposed to do. For example:



[photo: the same game shown on two different CRTs] (Ignore the messy floor)
Noticeable difference in aspect ratio between these two TVs, more so in person than in the pic. The left one was squished to hell, and they both chop off a huge amount of the picture. There is no correct aspect ratio; all video monitors rendered it differently.

Apr 29, 2015 at 11:50:59 AM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: Ozzy_98

Why do people say "wrong aspect ratio" for emulators? That's impossible; they're doing 1:1 pixel mapping. And the NES doesn't produce scanlines; your TV does. If the "aspect ratio" is wrong, then pump the emulator out to a CRT and stop blaming the emulator for stuff it's not supposed to do.

I was simply referring to the Wii-U Virtual Console. Their NES emulation is a perfect example. All you have to do is hook up your Wii-U to one CRT via a Wii composite cable, and your original NES to the other, and you'll see it immediately. The Wii-U squishes the picture vertically (at least it did the last time I used it, which was two years ago; maybe it's fixed? Nah....)

And 1:1 pixel mapping is wrong also. Take the SNES and Genesis, for example. They both output a bordered, 15 kHz, 4:3 picture on your TV in 240p, totally different from what an emulator will show you on your PC. On your PC your SNES/Genesis will give you square pixels, but the real consoles do not display square pixels on your CRT after the D/A conversion.




-------------------------
What happens when we die?

www.truthaboutdeath.com

Apr 29, 2015 at 3:29:14 PM
Ozzy_98 (8)
< Bowser >
Posts: 6369 - Joined: 12/11/2013
Ohio
Originally posted by: Kyle_Blackthorne

And 1:1 pixel mapping is wrong also. Take the SNES and Genesis, for example. They both output a bordered, 15 kHz, 4:3 picture on your TV in 240p, totally different from what an emulator will show you on your PC. On your PC your SNES/Genesis will give you square pixels, but the real consoles do not display square pixels on your CRT after the D/A conversion.
 

The emulator generates a single pixel, and the NES or Genesis will also generate a single pixel. If you hook your computer up to a TV and it doesn't look like an NES via the TV-out, for example if the pixels are not the correct aspect ratio, you need to tweak the TV-out. By default it'll be showing everything with no overscan.

The source really has no say in whether a pixel is square or not. It's a matter of "I can send this many via X and this many via Y." So for square pixels, you need a resolution that's 4:3. 320x240, for example, is a 4:3 ratio, aka 1.33333, so you get square pixels. 320x200 is not a 4:3 resolution, so the pixels are not square. The "standard" NES resolution is 256x240, which is a 1.06666 aspect ratio. So the NES actually outputs almost-square pixels on a 4:3 display with no overscan.

This isn't fully true anyway; the NES really outputs more along the lines of 280 dots per line and only uses 256 of them. And most TVs show about 224 lines of picture in the safe area (my CRTs all show less because I suck at picking CRTs).
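
Spelled out as code, the arithmetic in the last two paragraphs looks like this. A sketch only: the resolutions are the ones named in the post, and the pixel-shape formula (display aspect divided by storage aspect) is the standard one.

```python
# Pixel shape = display aspect / storage aspect, for a w x h frame
# filling a 4:3 screen with no overscan.

def pixel_aspect(w, h, display_aspect=4 / 3):
    """How much wider than tall each pixel is drawn."""
    return display_aspect / (w / h)

for w, h in [(320, 240), (320, 200), (256, 240)]:
    print(f"{w}x{h}: storage aspect {w / h:.5f}, pixel aspect {pixel_aspect(w, h):.4f}")
# 320x240 -> 1.0000 (square); 320x200 -> 0.8333 (tall); 256x240 -> 1.2500 (wide)

# Using the ~280-dot line mentioned above instead of 256 brings the NES
# figure much closer to square:
print(pixel_aspect(280, 240))   # ~1.14
```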

But for my setup, when I use an emulator on the CRT, other than the color difference, they look alike. There are scanlines on the CRT, and the ratio is correct because I adjusted the overscan to match, like you're supposed to in your TV-out settings. The Wii-U doesn't allow that via pure component, but it's not a limitation of the emulator so much as of the Wii-U itself. And it should be configurable, because many modern TVs still have overscan, even if people don't seem to notice. Because TV makers are evil assholes who learned that if you add a bit of overscan, people in movies and in the news look bigger, so your TV sells more often.

Apr 29, 2015 at 4:34:13 PM
PatrickM. (1)
< El Ripper >
Posts: 1390 - Joined: 10/28/2013
Texas
Originally posted by: Kyle_Blackthorne




Virtual Console (Wii-U) is terrible. Far, far from accurate. It has:

(1) terrible scaling,
(2) the wrong aspect ratio,
(3) no raster filter (i.e., no "scanlines"),
(4) terrible lag, even worse than the lag found on the Wii VC, and
(5) wrong sound effects and/or visual glitches from time to time.

Glad I abandoned the emulator ship long ago. Playing the real thing on a CRT is glorious, for more reasons than just the ones I listed.



Normally I would agree with you, but Retroarch on Linux is amazing. Every one of the concerns you have listed has been addressed:

1. It has perfect integer scaling at a resolution of 960x720 or 1280x960 (see the sketch after this list).

2. I'm playing in the "correct" 4:3 aspect ratio.

3. There are numerous scanline options, from the standard scanline overlay, which looks fantastic IMO (comparable to an XRGB-Mini), to CRT filters and shaders that can replicate bloom, color bleed, etc., if you're into that sort of thing.

4. The lag on Linux is nearly 0. This is probably the biggest difference between running Linux vs. not. There is some kind of magic in the Linux version that reduces the input lag from the emulator to nearly nothing. On a low-lag TV with game mode enabled, I cannot tell the difference between this and a real CRT, and I'm pretty sensitive to input lag.

5. Have not noticed any sound/visual glitches. There are a few minor things I might be able to pick up on in a side-by-side comparison vs. a real CRT, but nothing that detracts from the overall experience or immersion IMO.
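
A quick sketch of what those two resolutions do to the 256x240 NES frame. The output resolutions are the ones named in point 1; nothing here is RetroArch-specific.

```python
NES_W, NES_H = 256, 240

for out_w, out_h in [(960, 720), (1280, 960)]:
    print(f"{out_w}x{out_h}: horizontal x{out_w / NES_W}, "
          f"vertical x{out_h / NES_H}, frame aspect {out_w / out_h:.4f}")
# 960x720:  x3.75 / x3.0 -> integer vertical scale inside an exact 4:3 frame
# 1280x960: x5.0  / x4.0 -> integer scale on both axes (pixels drawn 1.25x wide)
```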

-------------------------
My backlog / games completed
 


Edited: 04/29/2015 at 04:40 PM by PatrickM.

Apr 30, 2015 at 1:03:37 AM
Orpheon (2)
< Eggplant Wizard >
Posts: 245 - Joined: 10/18/2014
Texas
I think I may have posted this elsewhere, but it was this exact discussion that got me into collecting. Specifically, it was this article: http://arstechnica.com/gaming/201...
I found it fascinating, and understanding the difficulty of emulation and the differences between emulating and using a console kinda made me appreciate the old hardware a bit more.

Apr 30, 2015 at 2:00:50 AM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: PatrickM.

Normally I would agree with you, but Retroarch on Linux is amazing. Every one of the concerns you have listed has been addressed: [...]

Linux might be lag-free (doubtful), but the emulator is not. Emulators are naturally laggy (and filters may increase that lag in most cases). Retroarch on the original Wii console averaged 3-4 frames of lag for me via proper test methods (on a CRT) when compared to a real console. You might not "feel" it, but it's there, and it will, without a doubt, hurt your gameplay. Try the Clinger Winger stage in Battletoads; see for yourself. And as far as accuracy is concerned, Retroarch's cores will never be truly accurate, especially seeing as how BSNES is not 100% accurate. Maybe nothing that you will notice, but there were some things I spotted immediately with many games in the Wii version, on the SNES and NES especially.

But if you're happy with your setup, then good for you. As for me, I'll continue to support retro game sellers on eBay. They are bringing me games that I cannot find around here for a good price, or at all. They are worth supporting, especially those who do it for a living and take the time to properly refurbish them. Real hardware is amazing and there's nothing like it. It's absolutely worth every penny. But to each their own. If you're happy, then I'm happy for you.


-------------------------
What happens when we die?

www.truthaboutdeath.com

Apr 30, 2015 at 2:03:03 AM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: Orpheon

I think I may have posted this elsewhere, but it was this exact discussion that got me into collecting. Specifically, it was this article: http://arstechnica.com/gaming/2011/08/09/accuracy-takes-powe...
I found it fascinating, and understanding the difficulty of emulation and the differences between emulating and using a console kinda made me appreciate the old hardware a bit more.

This.

Yes, realizing the shortcomings of emulators in general was just more motivation for me to stick to the original consoles. And the shortcomings were not even remotely minor, either (well, not for me at least).


-------------------------
What happens when we die?

www.truthaboutdeath.com

Apr 30, 2015 at 4:10:51 AM
PatrickM. (1)
< El Ripper >
Posts: 1390 - Joined: 10/28/2013
Texas
Originally posted by: Kyle_Blackthorne

Linux might be lag-free (doubtful), but the emulator is not. Emulators are naturally laggy (and filters may increase that lag in most cases). Retroarch on the original Wii console averaged 3-4 frames of lag for me via proper test methods (on a CRT) when compared to a real console. You might not "feel" it, but it's there, and it will, without a doubt, hurt your gameplay. Try the Clinger Winger stage in Battletoads; see for yourself. And as far as accuracy is concerned, Retroarch's cores will never be truly accurate, especially seeing as how BSNES is not 100% accurate. Maybe nothing that you will notice, but there were some things I spotted immediately with many games in the Wii version, on the SNES and NES especially. [...]
 
Just to give you a little background: I own real hardware and a CRT. I play and have beaten numerous twitch games, and I am well aware of input lag. I prefer the vastly superior video quality of emulators. With how little input lag there is with RA on Linux, I rarely use my CRT anymore.

Yes, Retroarch on Wii is laggy. Retroarch on Linux is not. There is a noticeable difference in input lag between the Linux version and other versions of RA. Give it a try; see for yourself. And a simple scanline overlay is not going to add any input lag. The more complicated shaders might, but depending on the game you're playing you might not even notice. I'm not a huge fan of shaders, myself, and just stick to a scanline overlay and a very light bilinear filter, which requires virtually no processing time (1 ms, maybe).

Have you ever encountered any inaccuracy that actually detracted from gameplay, and that would be noticeable to someone who hadn't played the game hundreds of times? Honestly, I think people make a huge deal out of nothing when it comes to accuracy. It's nitpicking, like (most) complaints about input lag, IMO.

Of course, once the HDMI NES/AVS is released, I'll no longer have any reason to use NES emulators.




-------------------------
My backlog / games completed
 


Edited: 04/30/2015 at 09:59 AM by PatrickM.

Apr 30, 2015 at 8:19:31 AM
Ozzy_98 (8)
< Bowser >
Posts: 6369 - Joined: 12/11/2013
Ohio
Originally posted by: Kyle_Blackthorne

Linux might be lag-free (doubtful), but the emulator is not. Emulators are naturally laggy (and filters may increase that lag in most cases). Retroarch on the original Wii console averaged 3-4 frames of lag for me via proper test methods (on a CRT) when compared to a real console. You might not "feel" it, but it's there, and it will, without a doubt, hurt your gameplay. Try the Clinger Winger stage in Battletoads; see for yourself. [...]

So, for a baseline to make sure it's the emulator you're seeing lag on, check something like Punch-Out!!; it's pretty responsive. How many frames of lag, if any, do you notice when you throw a punch on a CRT + real NES?



Apr 30, 2015 at 10:29:08 AM
bunnyboy (81)
(Funktastic B) < Master Higgins >
Posts: 7704 - Joined: 02/28/2007
California
Originally posted by: Kyle_Blackthorne

Retroarch on the original Wii console averaged 3-4 frames of lag for me via proper test methods (on a CRT) when compared to a real console.
What equipment were you using for those tests?

 

Apr 30, 2015 at 10:45:21 AM
arch_8ngel (68)
(Nathan ?) < Mario >
Posts: 35271 - Joined: 06/12/2007
Virginia
My question on the Wii emulation would be: how much of that latency is due to the Bluetooth controller, versus the actual emulation?

You'd need to use a Wii remote as a Bluetooth controller on your Linux box to have a fair comparison.

-------------------------
 

Apr 30, 2015 at 1:22:22 PM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: arch_8ngel

My question on the Wii emulation would be: how much of that latency is due to the Bluetooth controller, versus the actual emulation?

You'd need to use a Wii remote as a Bluetooth controller on your Linux box to have a fair comparison.

I didn't use Bluetooth; I used the GameCube ports. There wasn't much difference between the two, however.


-------------------------
What happens when we die?

www.truthaboutdeath.com

Apr 30, 2015 at 1:22:38 PM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: Ozzy_98

So, for a baseline to make sure it's the emulator you're seeing lag on, check something like Punch-Out!!; it's pretty responsive. How many frames of lag, if any, do you notice when you throw a punch on a CRT + real NES?

That is not how I check for lag. Testing by what I feel or notice is inaccurate. Sure, I'll notice issues with timing being off vs. the real thing, but exactly how much timing is missing is something I cannot determine without proper methods and tools.

-------------------------
What happens when we die?

www.truthaboutdeath.com

Apr 30, 2015 at 1:22:52 PM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: bunnyboy

Originally posted by: Kyle_Blackthorne

Retroarch on the original Wii console averaged 3-4 frames of lag for me via proper test methods (on a CRT) when compared to a real console.
What equipment were you using for those tests?

 

The best methods are:

1. Record (at 60 fps) the time difference between your finger slamming into the button and the action occurring on screen. Play back the video on your PC in a program that can break the action down into individual frames (i.e., watch how long it takes Mario to jump after you hit the A button).

2. Record the audio only. Get a sample of the sound of your finger slamming into the A button vs. the sound Mario makes when he jumps. Do this for both the real console and the emulator (wired controller, not Bluetooth). Then play back the audio files in something like Audacity and compare the difference in milliseconds.

The first method is best. The second method can be confusing, because developers didn't always code the sound effects to be in 100% perfect sync with the on-screen action. Sometimes the sound effect occurs 10 ms after the action, sometimes 50 ms, depending on how efficiently the ROM code was written. Nevertheless, this method can be perfectly fine to use if your camera doesn't record at 60 fps. Just keep in mind what I stated.
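
A minimal sketch of the arithmetic behind method 1, assuming you've already stepped through the footage and written down the frame where the button bottoms out and the frame where the character reacts. All the frame readings below are hypothetical, purely for illustration.

```python
CAPTURE_FPS = 60.0

def lag_ms(press_frame, react_frame, fps=CAPTURE_FPS):
    """Delay implied by two frame indices in the same capture."""
    return (react_frame - press_frame) * 1000.0 / fps

real_console = lag_ms(press_frame=120, react_frame=122)  # hypothetical readings
emulator = lag_ms(press_frame=305, react_frame=311)      # hypothetical readings
print(f"real console: {real_console:.1f} ms, emulator: {emulator:.1f} ms")
print(f"extra lag: {(emulator - real_console) / (1000 / 60):.1f} NES frames")
```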


-------------------------
What happens when we die?

www.truthaboutdeath.com

Apr 30, 2015 at 1:28:45 PM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: PatrickM.

I prefer the vastly superior video quality of emulators. With how little input lag there is with RA on Linux, I rarely use my CRT anymore.

Yes, Retroarch on Wii is laggy. Retroarch on Linux is not. There is a noticeable difference in input lag between the Linux version and other versions of RA. Give it a try; see for yourself. And a simple scanline overlay is not going to add any input lag. The more complicated shaders might, but depending on the game you're playing you might not even notice. I'm not a huge fan of shaders, myself, and just stick to a scanline overlay and a very light bilinear filter, which requires virtually no processing time (1 ms, maybe).

Have you ever encountered any inaccuracy that actually detracted from gameplay, and that would be noticeable to someone who hadn't played the game hundreds of times? Honestly, I think people make a huge deal out of nothing when it comes to accuracy. It's nitpicking, like (most) complaints about input lag, IMO.

Of course, once the HDMI NES/AVS is released, I'll no longer have any reason to use NES emulators.

You think the video quality is superior? Wow... well... yeah, to each their own. If it truly is superior, then you must be using a professional OLED BVM display. Surely you wouldn't make those comments regarding an LCD (or "LED," if you prefer fake names). With the horrendous motion blur, the terrible washed-out contrast ratio, the piss-poor black levels (especially in a dark room), and the too-sharp picture for old-school content, there's no way you could think that was superior to the perfect blacks, motion, etc. of a CRT (especially one with a dark mask). So yeah, you must be using a professional OLED display (or perhaps one of the latest low-lag plasma displays).

Regarding Linux, I don't have it; I'm still on Windows 7. Let's assume, however, that Linux doesn't have any processing delay. OK, good. But what about other possible delays that might be present, maybe in the USB processing, or elsewhere? (Forgive me, I'm not knowledgeable about computers.)

And yes, as stated earlier, I have encountered inaccuracies, and yes, they have detracted from gameplay, but only in games that I was already familiar with. Obviously! If I'm not familiar with the game, then how would I know? Haha.

Speaking of accuracy, we must consider the fact that game developers back then used composite/RF output on a small monitor for final analysis. Sure, they developed the pixels via RGB, because, well, they had a PC monitor on one side for graphics design, and a 13-20 inch TV monitor on the other. But that TV monitor was there for final analysis, so that they could see the final product through the consumer's eyes. Take the Sega Genesis as a good example. Dithering was abused greatly on that console due to the fact that the video encoder had no trap filter, which increased the color bleed. This was taken advantage of big time in many - if not most - of the top-rated Genesis games. Dithered patterns were used like crazy, but they only work over composite/RF, not RGB. But then again, 99.9% of all Genesis owners were using composite/RF, so that's how games were developed. So things like an HDMI Genesis or an HDMI NES will always be wrong.
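
As a toy illustration of the dithering trick just described (a sketch, not how any particular game did it): alternate two palette values along a scanline and let composite color bleed average neighboring dots into a shade the hardware can't draw directly. A simple two-tap average stands in for the much messier real composite signal.

```python
def composite_blur(row):
    """Crudely mimic horizontal color bleed by averaging adjacent dots."""
    return [(row[i] + row[i + 1]) / 2 for i in range(len(row) - 1)]

bright, dark = 255, 0            # one color channel, two alternating values
scanline = [bright, dark] * 5    # checkerboard-style dither along one line
print(composite_blur(scanline))  # every output lands on the 127.5 midpoint
```

Over RGB or HDMI the two values stay fully separated, which is why the same pattern reads as stripes instead of a new color.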

In other words, these games were never meant to be displayed on a too-sharp, poor-contrast LCD with even a hint of lag. They were meant for composite/RF, in 240p, on a CRT.

Kinda how most movies are meant to be viewed at 6500 K (not with the exaggerated red saturation most people view them with). Viewing it the way the developer intended will always be accurate. You may prefer to view it differently, but that doesn't make it correct. But as long as you're happy, who cares? So long as you're happy with your setup, that's all that really matters in the end. Some people may prefer to see it the way the developer intended, others may not. Different strokes for different folks.

Anyway, I think I've made a mistake posting in this thread. My initial post was simply meant to clear up a few misconceptions and help the uninformed, hence my stating in it "I don't want to argue". And maybe I'm wrong, but I think that's what this has turned into (an argument). I may have achieved the opposite of my original intentions. So let me just say this: I believe I've said all I need to say, and thus I need to simply exit the conversation. Forgive me if I've come across as harsh or arrogant; I was only trying to help.

-------------------------
What happens when we die?

www.truthaboutdeath.com


Edited: 04/30/2015 at 03:06 PM by Kyle_Blackthorne

Apr 30, 2015 at 3:06:27 PM
PatrickM. (1)
< El Ripper >
Posts: 1390 - Joined: 10/28/2013
Texas
Originally posted by: Kyle_Blackthorne

You think the video quality is superior? Wow... well... yeah, to each their own. If it truly is superior, then you must be using a professional OLED BVM display. Surely you wouldn't make those comments regarding an LCD (or "LED," if you prefer fake names). [...]

You must be using a piss-poor LCD or else haven't properly calibrated it. On a properly calibrated plasma or good LCD the results are amazing, superior to real hardware.

The native HDMI output of an emulator beats the crappy signal of real hardware, whether it's RGB, composite, or whatever else you are using, hands down, any day of the week. You just can't beat native HDMI output with anything else. It can't be done.

I'm also very skeptical of the claim that bloom, color bleed, and a crappy video signal were taken into account when designing the graphics. These would have varied so much between individual TV sets that it would have been impossible to know how they would affect the image for "the average user." There may be particular Genesis games that were an exception - different artists employed different tricks. But for the most part, native HDMI beats any other signal you can use.

There are entire websites dedicated to this stuff: the NeoGAF forums and the shmups forums. Fudoh and the other die-hard video nerds will tell you the same thing I've been saying. Someone had better go tell those guys that they're wasting hundreds of dollars buying upscalers like the XRGB-Mini.

Also, I wonder how an "inaccuracy" that can only be noticed by someone who has played the game hundreds of times can possibly detract from the gameplay or overall experience.



-------------------------
My backlog / games completed
 


Edited: 04/30/2015 at 03:20 PM by PatrickM.

Apr 30, 2015 at 3:26:21 PM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: PatrickM.
You must be using a piss-poor LCD or else haven't properly calibrated it. On a properly calibrated LCD the results are amazing, superior to real hardware.

I'm also very skeptical of the claim that bloom, color bleed, and a crappy video signal were taken into account when designing the graphics. These would have varied so much between individual TV sets that it would have been impossible to know how they would affect the image for "the average user." There may be particular Genesis games that were an exception - different artists employed different tricks. But for the most part, native HDMI beats any other signal you can use.

There are entire websites dedicated to this stuff: the NeoGAF forums and the shmups forums. Fudoh and the other die-hard video nerds will tell you the same thing I've been saying.
I've had many discussions with Fudoh under a different user name on the shmups forums during my XRGB testing days. He's very well aware of the fact that LCDs have poor contrast, even though he prefers to use them. In fact, when I had my favorite CRT ISF-calibrated by Chad Bilheimer (a well-known ISF calibrator on the East Coast: http://hdtvbychadb.com), he too admitted that LCDs are junk in comparison to CRTs and plasmas. It's a well-authenticated fact beyond debate. No one, and I mean no one, can change the fact that LCDs have poor blacks and terrible motion blur. Even the full-array local-dimming ones have contrast issues due to blooming from lit zones vs. non-lit zones. Chad told me that the sooner LCDs die, the better. And he's calibrated thousands of them. Again, he's ISF-certified and no stranger to them. I've had them, and sold them/returned them in the past. They suck, and no one can deceive my eyes into believing otherwise. If the ISF guy agrees with me, then I'm in good company and have no need to be convinced otherwise.

And it's also a fact that those old games (maybe not all, but most) were developed with the knowledge that most people were using composite/RF. Just look at the vertical-line dithering that's used to create fake transparencies in Genesis games, or the "X" or checkerboard dithering that's used to create additional colors in tons of other games on various consoles. Older TVs from the '80s and early '90s did not have terrific comb filters, and this results in poor color separation, unlike newer sets with better comb filters. And thus, dithering would ALWAYS blend to create new colors on virtually all older sets. It's a fact; it does not need debating. HDMI is superior "technically," yes, but then again, modern 3D polygons are superior "technically" to old 2D pixels on the Saturn, yet rarely do the modern graphics look more beautiful. Again, if you prefer HDMI, then why try to convert me? Enjoy it and don't worry about what others think.

Again, I only wanted to help. I am NOT interested in debating (arguing). My purpose was to inform and dispel some myths. You're not going to convert me over to emulators and HDMI-XRGB-whatever. I've gone full circle: from CRT, to LCD/XRGB/emulator, and back to CRT. And I have damn good reasons for going back to the CRT. My initial post was not directed at you, but at many other comments in the thread. Don't take it personally, OK? Enjoy.



-------------------------
What happens when we die?

www.truthaboutdeath.com


Edited: 04/30/2015 at 03:30 PM by Kyle_Blackthorne

Apr 30, 2015 at 3:33:06 PM
Ozzy_98 (8)
< Bowser >
Posts: 6369 - Joined: 12/11/2013
Ohio
Originally posted by: PatrickM.

I'm also very skeptical of the claim that bloom, color bleed, and a crappy video signal were taken into account when designing the graphics. These would have varied so much between individual TV sets that it would have been impossible to know how they would affect the image for "the average user." There may be particular Genesis games that were an exception - different artists employed different tricks. But for the most part, native HDMI beats any other signal you can use.

It was. Generally what you do is checkerboard a pattern, then invert it the next frame, or just show it one frame and not the next. Most CRTs of the day couldn't really resolve a full 60 fps, so it read as transparent. But there is no right or wrong way, because there was no standard. That's why different NES games, for example, all had different amounts of viewable space; they aimed for the biggest demographic there was. My old CRT, for example, didn't really show scanlines, so to me, people wanting black lines in their picture seems silly.
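
The frame-alternation variant, sketched the same way: draw a sprite on even frames only and treat what the eye perceives as the average of two consecutive frames. The intensity values are purely illustrative.

```python
sprite, background = 200, 60   # single-channel intensities, purely illustrative

def shown(frame_number):
    """Sprite drawn on even frames only, hidden on odd ones."""
    return sprite if frame_number % 2 == 0 else background

# What phosphor persistence / eye blur integrates over two consecutive frames:
perceived = (shown(0) + shown(1)) / 2
print(perceived)  # 130.0: halfway between sprite and background, ~50% transparent
```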


Apr 30, 2015 at 3:46:04 PM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: Ozzy_98

My old CRT for example didn't really show scanlines, so to me, people wanting black lines in their picture seems silly.
 

Yeah, some really small CRTs had really poor focus and would blend the raster lines together. Also, some CRTs were worn out badly and thus had poor focus, too. But factory-fresh or decent CRTs always had dark spaces between scanlines. And because games were designed around them, they look wrong without them. But a lot of people will never know that until they experience the games as intended (i.e., on a CRT with good focus).


-------------------------
What happens when we die?

www.truthaboutdeath.com

Apr 30, 2015 at 3:53:17 PM
bunnyboy (81)
(Funktastic B) < Master Higgins >
Posts: 7704 - Joined: 02/28/2007
California
Originally posted by: Kyle_Blackthorne

1. Record (at 60 fps) the time difference between your finger slamming into the button and the action occurring on screen. Play back the video on your PC in a program that can break the action down into individual frames (i.e., watch how long it takes Mario to jump after you hit the A button).
Recording at 60 fps can't get 1-frame resolution, because of Nyquist. That applies at both ends, so a ~1 frame actual delay can show up as ~3 frames in your test. You might still end up with similar results, but it certainly isn't an accurate test.
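
Back-of-the-envelope version of that point: the camera quantizes both the press and the reaction to ~16.7 ms bins, so each end of the measurement carries up to one frame of error.

```python
FRAME_MS = 1000 / 60                    # ~16.67 ms per capture frame

true_delay = 1 * FRAME_MS               # a genuine 1-frame delay...
worst_case = true_delay + 2 * FRAME_MS  # ...plus up to 1 frame of error per end
print(f"true: {true_delay:.1f} ms, worst-case reading: {worst_case:.1f} ms (~3 frames)")
```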

 

Apr 30, 2015 at 3:56:41 PM
Ozzy_98 (8)
< Bowser >
Posts: 6369 - Joined: 12/11/2013
Ohio
Originally posted by: Kyle_Blackthorne

Yeah, some really small CRTs had really poor focus and would blend the raster lines together. Also, some CRTs were worn out badly and thus had poor focus, too. But factory-fresh or decent CRTs always had dark spaces between scanlines. And because games were designed around them, they look wrong without them. But a lot of people will never know that until they experience the games as intended (i.e., on a CRT with good focus).


Actually, it was a brand-new 24" CRT I got in 5th grade (I was born in 1978, and I'm not doing the math; I have 4 minutes left on the clock and I'm not thinking for the rest of the day). I forget who made it; it's at my dad's house as a spare TV. The shadow mask on it is very fine, which helped, but the main thing is it looks like it repeated the unused lines. 240p isn't really 240p, not the same way there's a 480p; 240p is still the same vertical size as 480p, it just skips every other line. Not all TVs processed this the same way: while most older CRTs skipped a line, some of them doubled the displayed lines, the same way newer TVs do. In the case of this TV, you had a slight line between each scanline and pixel, but no darkened line where it skipped over the image. And it would be fucking amazing if the enter key would start working for me, damnit.

Apr 30, 2015 at 4:07:24 PM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: bunnyboy

Recording at 60 fps can't get 1-frame resolution, because of Nyquist. That applies at both ends, so a ~1 frame actual delay can show up as ~3 frames in your test. You might still end up with similar results, but it certainly isn't an accurate test.

 
That's where the audio method can work, then. Either way, you can find the difference between the two. The audio method works extremely well for determining the gap between the two sources. In fact, the audio method might be superior because it doesn't rely on frames; you can get exact differences between the two sources.
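
The appeal in numbers: a 48 kHz recording resolves time to about 0.02 ms per sample, far finer than a 60 fps camera. A sketch, with hypothetical sample indices read off the waveform in something like Audacity:

```python
SAMPLE_RATE = 48_000  # samples per second

def offset_ms(click_sample, sfx_sample, rate=SAMPLE_RATE):
    """Delay between the button click and the game's sound effect."""
    return (sfx_sample - click_sample) * 1000.0 / rate

real_console = offset_ms(10_000, 12_400)  # 50.0 ms (hypothetical readings)
emulator = offset_ms(52_000, 57_280)      # 110.0 ms (hypothetical readings)
print(f"difference: {emulator - real_console:.1f} ms")
```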



-------------------------
What happens when we die?

www.truthaboutdeath.com


Edited: 04/30/2015 at 04:11 PM by Kyle_Blackthorne

Apr 30, 2015 at 4:17:17 PM
Kyle_Blackthorne (1)
< Tourian Tourist >
Posts: 38 - Joined: 06/18/2013
Alabama
Originally posted by: Ozzy_98

Actually, it was a brand-new 24" CRT I got in 5th grade (I was born in 1978, and I'm not doing the math; I have 4 minutes left on the clock and I'm not thinking for the rest of the day). I forget who made it; it's at my dad's house as a spare TV. The shadow mask on it is very fine, which helped, but the main thing is it looks like it repeated the unused lines. 240p isn't really 240p, not the same way there's a 480p; 240p is still the same vertical size as 480p, it just skips every other line. Not all TVs processed this the same way: while most older CRTs skipped a line, some of them doubled the displayed lines, the same way newer TVs do. In the case of this TV, you had a slight line between each scanline and pixel, but no darkened line where it skipped over the image.

Yeah, thing is, high-quality CRTs had those black lines, and high-quality CRTs are what development studios used, and that's why the retro games look better with them than without (i.e., they were designed with them, thus they are a part of the experience). The standard monitors used were professional monitors, and I've never seen a professional monitor without some kind of gap between lines. But hey, if you don't like them, well, that's your business and none of mine. Haha.

-------------------------
What happens when we die?

www.truthaboutdeath.com

Apr 30, 2015 at 4:19:20 PM
bunnyboy (81)
(Funktastic B) < Master Higgins >
Posts: 7704 - Joined: 02/28/2007
California
But then you are measuring audio latency, not video latency. A 100 ms audio buffer sounds fine to almost everyone, but a 100 ms video buffer would be noticeable to almost everyone.