You know, I've been doing stuff on the SNES (as you're probably aware), and as my code gets longer, I kind of want to know how hard the hardware is being pushed so I can get an idea of how fast the SNES is. Is there an easy way to see how close the SNES is to failing to complete a frame at 60 fps?
On the NES, it is common to use one of the bits in PPUMASK (monochrome, color emphasis, left-edge clipping) to mark where processing for the frame is done.
On the SNES ... it looks like you should be able to use the brightness control in INIDISP for the same effect. (I'm certain there are many options; that's just the first one that looks suitable.)
Respectfully: I don't think INIDISP (specifically screen brightness) will be very helpful. You really need a selection of distinct colours, one per routine (blue for screen layout building, red for VRAM DMA, green for sprite building, purple for sprite DMA, yellow for sound register updates, etc.).
I would suggest using the colour addition/subtraction registers (CGWSEL and CGADSUB) with a fixed colour of your choice (maybe COLDATA can help? It's been a while).
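For reference, a minimal sketch of the fixed-colour approach (ca65-style 65816 assembly; register addresses are from the standard SNES docs, and it assumes colour math has already been set up via CGWSEL/CGADSUB so the fixed colour actually shows through on the backdrop):

```
; COLDATA ($2132): bit 7 selects blue, bit 6 green, bit 5 red;
; bits 0-4 are the intensity for the selected channel(s).
SetMeterRed:
    lda #%00111111   ; red select + intensity 31
    sta $2132        ; COLDATA
    lda #%01000000   ; green select, intensity 0
    sta $2132
    lda #%10000000   ; blue select, intensity 0
    sta $2132
    rts
```

Call a routine like this at the start of each timed section, with a different colour per section.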
Or how about changing palette colour #0 (usually used for the background colour)? Just make sure it's visible somewhere on the screen, from the very top all the way to the bottom -- maybe set the entire right side of the screen to a tile that contains nothing but the background colour.
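A sketch of that backdrop trick (ca65-style; CGADD = $2121, CGDATA = $2122 -- writing CGRAM mid-frame may glitch a pixel or two, but a visible colour change is exactly the point):

```
SetBackdropRed:
    stz $2121        ; CGADD: point at palette entry 0 (the backdrop)
    lda #$1F         ; low byte of BGR555 red (%0000000000011111)
    sta $2122        ; CGDATA: low byte first
    stz $2122        ; CGDATA: high byte
    rts
```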
If you want to get fancy, you could
read from SLHV ($2137) to load the current scanline into OPVCT ($213D), and then display a CPU usage graph. If you start exceeding 192 lines too often, you're in danger of slowdown.
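For the curious, reading that latch looks roughly like this (ca65-style sketch; SLHV = $2137, OPVCT = $213D; `scanline` is a hypothetical two-byte RAM variable):

```
ReadScanline:
    lda $2137        ; SLHV: dummy read latches the H/V counters
    lda $213D        ; OPVCT, first read: scanline bits 0-7
    sta scanline
    lda $213D        ; OPVCT, second read: bit 0 is scanline bit 8
    and #$01         ; remaining high bits are open bus, hence the mask
    sta scanline+1
    rts
```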
tepples wrote:
If you want to get fancy, you could
read from SLHV ($2137) to load the current scanline into OPVCT ($213D), and then display a CPU usage graph. If you start exceeding 192 lines too often, you're in danger of slowdown.
Could that be what's used here (the little arrow going up and down on the left)?
https://www.youtube.com/watch?v=JBXBXEAHMws
At around 2:00 in the video, it says that it's a CPU usage meter. Unless the bottom is the most effort and the top is the least, I don't see how a simple RPG like that is nearly maxing out the SNES in some areas, as it seems to be. Also, how does the page-turning effect possibly use all 8 HDMA channels? It appears to use just one window, changing the palette of the page every couple of scanlines and messing with BG parameters.
I think he used 1 background layer, and real-time software sprite clipping to fake BG layer priority.
psycopathicteen wrote:
I think he used 1 background layer, and real-time software sprite clipping to fake BG layer priority.
You know, that reminds me of something. I always thought it would be cool to make a game using Mode 4: make BG1 the 256-colour layer with the 2bpp BG2 underneath it, then mess with the priorities so that sprites are on top of BG1 but below BG2, while BG2 is above sprites but under BG1 -- so it looks like there are two 256-colour layers. This would work fine in a game with an overhead view.
Updating the overscan border color was a common technique console developers used to measure processor usage.
I could swear I saw a SNES game with a debug feature that included some kind of CPU usage meter. The border/overscan color sounds like a nice idea, since you could actually color-code it for different tasks. But it might be just as useful in development to have the game keep track of time with SLHV/scanline readback. You could even measure how long a particular section of code takes by noting both the scanline on entry and the scanline on exit.
For what Espozo wants, I think the easiest thing to implement might be manipulating the brightness control: set brightness to 50%, and when frame processing completes, set it to 100% (or swap the brighter and darker areas to your preference).
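A minimal sketch of that idea (ca65-style; INIDISP = $2100, where the low 4 bits are brightness 0-15 and bit 7 is forced blank):

```
; at the top of the frame (e.g. the start of the NMI handler):
    lda #$08         ; ~50% brightness while the frame's work runs
    sta $2100        ; INIDISP
    ; ... game logic, DMA setup, etc. ...
    lda #$0F         ; full brightness once processing is done
    sta $2100
```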
For a less intrusive measure, you might want the scanline number and a simple counter drawn on screen with sprites.
With my SNES NSF player, while developing it I used the horizontal IRQ triggering before hblank, where I change the background color. During development I was watching how long the init and play routines were taking. Later on, in the form it was released in, it did that only while data was being decompressed, displaying data from the buffer. It slowed down the unpacking slightly, but I thought it looked kind of cool, and it was probably sort of inspired by some of the "packed" Atari ST games I was playing at the time.
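Roughly, setting up an H-IRQ like that looks like this (ca65-style sketch; the H-count value and the NMITIMEN bit combination are my assumptions, not Memblers' actual code):

```
    lda #$10         ; HTIME = 272 dots, shortly before hblank starts
    sta $4207        ; HTIMEL
    lda #$01
    sta $4208        ; HTIMEH
    lda #$90         ; NMITIMEN: NMI enable (bit 7) + H-count IRQ (bit 4)
    sta $4200
    cli              ; allow the IRQ
; The IRQ handler reads TIMEUP ($4211) to acknowledge the interrupt,
; then changes the backdrop colour via CGRAM or COLDATA.
```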
https://www.youtube.com/watch?v=cHPgbuM3Yo8
It looks like my code for that was posted before:
http://forums.nesdev.com/viewtopic.php?t=6956
I'll second Memblers' methodology (it mirrors what I said above re: changing colour #0, the background colour).
The reason INIDISP (brightness) won't work is that your eyes aren't going to be able to easily tell (or track) the difference between 25%, 50%, 75%, and 100% brightness (I just picked four grades there, despite the SNES having 16). All you'd get are "darker areas", and it would be hard to tell which brightness represented which piece of code. Plain and simple: it's not an effective way to visualise this kind of measurement. Using specific colours alleviates all of that -- you know which routine is represented by red, which by green, which by magenta, etc.
I think it's been implied that INIDISP brightness is in TV gamma, and human perception of brightness is logarithmic, so I think the difference between an INIDISP value of 15 and 11 would look the same as 11 to 8, 8 to 6, 6 to 4, 4 to 3, 3 to 2, and 2 to 1.
Of course, humans are also really bad at perceiving relative brightness, so it's doubtful that a single stop (halving of brightness) would be enough, and you might want to skip steps.
koitsu: I think the idea here is just to measure total CPU time, in which case you're only measuring one thing. Sure, it'd possibly be more useful to measure every part, but you'd run out of easily distinguishable colors quickly as well. (You could also do strips of varying height, so you still get multiple measurements while using only two values.)
psycopathicteen wrote:
I think he used 1 background layer, and real-time software sprite clipping to fake BG layer priority.
Well, that'd explain it, but is it needed in battle as well? It sure doesn't seem so, yet that's where the CPU meter jumps around the most. I can't help but think it's buggy.
Nah, you've got enough colours for most things. Stick with the "mains" (I want to say primaries, but Tepples would object): red, green, blue, yellow, magenta, cyan, and white. That's 7. On the IIGS, using the border methodology, those (along with black) were what we used -- eight was more than enough.
The thought process goes like this: pick a few main areas you want to time (say four, using red, green, blue, and white). Implement them using the aforementioned method, each with its own colour, where white means "everything else". Run a test -- if the red/green/blue sections look about right (i.e. are not taking up large portions of the screen, meaning the programmer can say "yeah, I guess that's about right"), check those off and focus on the other areas of the code (the "white" areas), reassigning some of those to red/green/blue; lather, rinse, repeat until you've narrowed it down.
Eventually you optimise things to where all or most of your CPU time is spent in your routines in aggregate (i.e. very little "white" left) -- that's where you start having to make design decisions or re-engineer things for speed.
Anyway, just sharing how we did it "back in the day". Really not rocket science. But brightness really wouldn't provide enough granularity, IMO.
MottZilla wrote:
I could swear I saw a SNES game with a debug feature that included some kind of CPU usage meter.
There are probably several, but Final Fight has a debug mode with this CPU meter.
Espozo: since CRT TVs draw the image from top-left to bottom-right in a zigzag ("raster scan") fashion, a higher-up "timing" arrow or BG colour change means LESS CPU time used, and a colour change seen near the bottom of the screen means MORE CPU time is being used.
ccovell wrote:
Espozo: since CRT TVs draw the image from top-left to bottom-right in a zigzag ("raster scan") fashion, a higher-up "timing" arrow or BG colour change means LESS CPU time used, and a colour change seen near the bottom of the screen means MORE CPU time is being used.
I figured as much, but I wasn't sure. (When I first started SNES dev, I was severely confused when I added numbers to the BG1 Y position register and the screen started to move down instead of up.)