Hi,
According to the wiki, a DMC DMA stall is up to 4 CPU clocks long: the first 3 clocks are just re-reads of whatever address the CPU was already reading (the 2A03's 6502 can only be halted on a read cycle, since write cycles such as interrupt stack pushes cannot be stopped), and the last clock is the actual DMC sample byte fetch.
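If I'm reading the wiki right, the worst case looks something like this (my own annotation of the cycle breakdown, not a quote from the wiki):

    ; Worst-case DMC DMA stall as I understand it (4 CPU clocks):
    ;   clock 1: halt      - the CPU's in-progress read cycle is repeated
    ;   clock 2: dummy     - the same address gets read again
    ;   clock 3: alignment - and possibly once more, depending on cycle parity
    ;   clock 4: get       - the DMC fetches its sample byte from $C000-$FFFF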
The wiki also mentions that this DMA causes false extra reads of read-sensitive ports such as $4016/$4017, $2007, and even $2002 whenever the DMC fetch happens to coincide with a 6502 instruction loading from one of those addresses.
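To make the hazard concrete, the standard ring-counter controller read is exactly the kind of code that gets hit; a quick sketch (label and variable names are mine):

    ; Plain controller read - vulnerable to a DMC fetch double-clocking $4016
    read_joypad:
            lda #$01
            sta $4016       ; strobe: latch current button states
            sta buttons     ; seed the ring counter with a 1
            lsr a           ; A = 0
            sta $4016       ; end strobe
    loop:   lda $4016       ; if DMC DMA lands on this read, $4016 gets read
            lsr a           ; twice and a button bit is silently dropped
            rol buttons     ; shift the new bit in; after 8 reads the seed
            bcc loop        ; 1 falls into carry and ends the loop
            rts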
I've always noticed how many games stop all game animation just to play back DMC samples (Rare's games in particular), yet Nintendo's and especially Konami's games never seem to suffer from the DMC DMA unit's side effects on, say, controller reads from $4016.
How did Konami master the use and/or timing of the DMC unit so that it never disturbs their controller reads (and there are a ton of samples played during in-game action)? Did they tie their DMC sample-init code tightly to the controller reading? If so, how did they avoid a coincidental DMA fetch when a sample lasts across several frames' worth of $4016/$4017 reads? Or did they use very specific sample rates that somehow work out across multiple video frames? Were they just lucky, or just really good NES programmers (I tend to believe the latter)?
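For what it's worth, the only defense I can picture is the re-read trick: read the pad twice and only accept two results that agree, so a DMA-corrupted read gets thrown away. A sketch building on the read_joypad routine above (buttons_prev is a variable I made up):

    ; Defensive read: keep reading until two consecutive reads match
    read_joypad_safe:
            jsr read_joypad     ; first read into 'buttons'
    retry:  lda buttons
            sta buttons_prev    ; remember the last result
            jsr read_joypad     ; read again
            lda buttons
            cmp buttons_prev
            bne retry           ; mismatch -> a DMA likely hit one read
            rts

Is that roughly what Konami did, or something smarter?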
Just curious is all; it seems the DMC was too much of a headache for many commercial developers to bother using in their games (I can understand why!).