I was thinking about framerate today.
Exactly what framerate did the NES operate at?
Okay, the short answer is probably going to be 30 fps or 29.97 fps (assuming we're talking about the NTSC version), but I'm really interested in the gory details that come along with this subject.
The way frames were handled back then was a whole different world from today. Today, games can drop frames while still running at the same speed, or even gain extra frames if they aren't capped. But I understand the timing mechanisms were different back then: when there were too many objects on the screen, the game slowed down. (I remember that.) So what is going on when that happens? I would guess the system just keeps sending the same image to the TV until it gets a new one, but I wonder if there is more to it than that.
And what about on the flip side? What happens when a frame gets rendered with time to spare?
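To make concrete what I'm picturing for both of these cases, here is a little host-side C simulation of my mental model. This is not real NES code, and the cycle count and workload numbers are made up for illustration; I'm assuming the CPU's game logic runs once per vblank, idles if it finishes early, and produces "lag frames" (the TV re-showing the old image) when it overruns.

```c
/* Sketch of my mental model of NES frame timing (NOT actual hardware
 * behavior; all numbers here are hypothetical). Game logic runs once
 * per frame; if it fits within one vblank interval the CPU just idles
 * until the next vblank, and if it overruns, every missed vblank
 * re-displays the previous image: a lag frame, i.e. slowdown. */
#include <stdio.h>

#define CYCLES_PER_FRAME 29780   /* roughly one NTSC frame of CPU cycles */

int main(void) {
    /* Hypothetical per-frame workloads; frames 3 and 4 are "busy"
     * (think: too many objects on screen). */
    int workload[] = {20000, 25000, 45000, 60000, 22000};
    int shown = 0;   /* which game state the TV is currently showing */

    for (int i = 0; i < 5; i++) {
        /* How many vblanks this frame's logic spans. One means the
         * CPU finished with time to spare and simply waits; more than
         * one means the display repeats the old image meanwhile. */
        int frames_taken = (workload[i] + CYCLES_PER_FRAME - 1) / CYCLES_PER_FRAME;

        for (int lag = 1; lag < frames_taken; lag++)
            printf("vblank: lag frame, TV re-shows state %d\n", shown);

        shown = i + 1;
        printf("vblank: state %d uploaded to the PPU\n", shown);
    }
    return 0;
}
```

If that model is wrong (for example, if something more interesting than "re-show the old image" happens during slowdown), that's exactly the kind of detail I'm after.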
And how does a game handle the difference between NTSC and PAL?
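Here's the arithmetic I imagine is at stake. If speeds are expressed in pixels per frame, the same table of values runs noticeably slower at PAL's ~50 Hz than at NTSC's ~60 Hz unless someone rescales them. I don't know whether real ports actually did this; the speed value below is hypothetical, and the frame rates are only approximate.

```c
/* Sketch of why per-frame speeds don't port cleanly between regions.
 * The run speed is a made-up example value, not from any real game. */
#include <stdio.h>

int main(void) {
    const double ntsc_fps = 60.10;   /* approximate NTSC NES frame rate */
    const double pal_fps  = 50.01;   /* approximate PAL NES frame rate */
    int run_speed = 5;               /* hypothetical pixels per frame */

    /* Pixels per second is what the player actually perceives. */
    printf("NTSC:             %.1f px/s\n", run_speed * ntsc_fps);
    printf("PAL, unadjusted:  %.1f px/s (about 17%% slower)\n",
           run_speed * pal_fps);

    /* One possible fix: rescale the per-frame speed so real-time
     * speed matches the NTSC original. */
    double run_speed_pal = run_speed * ntsc_fps / pal_fps;
    printf("PAL, rescaled:    %.2f px/frame -> %.1f px/s\n",
           run_speed_pal, run_speed_pal * pal_fps);
    return 0;
}
```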
I recall a while back I was trying to reverse-engineer Mega Man's mechanics to make a platformer I was working on feel more comfortable. I noticed that no object ever seemed to move faster than 16 pixels per frame. This made sense, since the game presumably doesn't track an object's path, only what it is colliding with on a given frame. Thus, if the player fell faster than 16 pixels per frame, they could potentially skip over an entire block and fall right through it.
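To show the tunneling effect I mean, here's a small sketch. It assumes collision is only tested at each frame's final position (which is my guess, not something I've verified against the actual game); the one-tile "map" and function names are made up.

```c
/* Sketch of the tunneling problem: collision is checked only at each
 * frame's endpoint, so a fall speed above the tile size (16 px) can
 * step right over a solid tile. Everything here is hypothetical. */
#include <stdbool.h>
#include <stdio.h>

#define TILE 16

/* One solid tile occupying y in [64, 80). */
bool solid_at(int y) { return y >= 64 && y < 64 + TILE; }

void fall(int y, int speed) {
    printf("speed %2d px/frame: ", speed);
    for (int frame = 0; frame < 8; frame++) {
        y += speed;           /* jump straight to the new position */
        if (solid_at(y)) {    /* only the endpoint is ever tested */
            printf("landed on the tile at y=%d\n", y);
            return;
        }
    }
    printf("fell straight through (no frame ever sampled the tile)\n");
}

int main(void) {
    fall(0, 12);  /* <= 16: some frame must land inside the tile */
    fall(0, 20);  /* > 16: can miss the 16-px tile entirely */
    return 0;
}
```

With steps of at most 16 pixels, some sampled position is guaranteed to fall inside a 16-pixel tile, which would explain the speed cap I observed; sweeping the whole path instead would fix it but presumably cost cycles the hardware didn't have to spare.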
But how do per-frame calculations work when the system is outputting a different framerate?
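For what it's worth, my working assumption is that these per-frame speeds are stored as fixed-point values (whole pixels plus a subpixel fraction), since that's what would make fine per-frame tuning possible at all. A sketch of that idea, with a made-up speed value:

```c
/* Sketch of 8.8 fixed-point movement, which I assume per-frame logic
 * uses: position and velocity carry a subpixel byte, so speeds that
 * aren't whole pixels per frame still add up correctly over time. */
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint16_t pos = 0;        /* 8.8 fixed point: high byte = pixel */
    uint16_t vel = 0x0180;   /* hypothetical speed: 1.5 px per frame */

    for (int frame = 1; frame <= 4; frame++) {
        pos += vel;
        printf("frame %d: pixel %u, subpixel %u/256\n",
               frame, pos >> 8, pos & 0xFF);
    }
    return 0;
}
```

If that's roughly right, then running the same tables at a different frame rate changes real-time speed unless the values are retuned, which loops back to my NTSC/PAL question above.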
And if I am mistaken in my assumptions about how a computer processes data between rendered frames, please enlighten me.