NTSC vs PAL

By eimaster

Champion (285)

07-12-2017, 07:53

Hi
I have a Middle Eastern Arabic PAL MSX which runs at 50 Hz. I noticed that a Japanese NTSC MSX runs a little faster, which is easy to notice when listening to game music. Does this mean that the CPU of a Japanese NTSC MSX runs at a higher clock speed than the CPU of a PAL MSX, which is about 3.57 MHz? And if the CPU clock speed of NTSC and PAL MSX machines is the same, why does the running speed of PAL and NTSC MSXes differ, if the difference only affects the video output? Please explain in detail if possible.

By PingPong

Enlighted (4155)

07-12-2017, 08:17

Not more speedy. It is only the vblank interrupt frequency that is higher.
But if the music tempo is measured using the interrupt frequency, then the music plays faster on NTSC.

By meits

Scribe (6571)

07-12-2017, 11:03

The same goes for the NES and Master System game consoles as well.
A program's music player gets new music data on every interrupt. If there are 60 interrupts per second, it will reach the end of the track sooner than when there are 50 interrupts per second.
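To put rough numbers on that, here is a tiny C sketch (the 3000-step song length is made up purely for illustration) of how long the same song data lasts at the two interrupt rates:

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical song: 3000 steps, the replayer advances one step per interrupt. */
    const double steps = 3000.0;

    printf("At 50 interrupts/s: %.0f seconds\n", steps / 50.0); /* 60 seconds */
    printf("At 60 interrupts/s: %.0f seconds\n", steps / 60.0); /* 50 seconds */
    return 0;
}
```

Same data, but the 60 Hz machine plays it 20% faster and finishes in 50 seconds instead of 60.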

It's beyond me why the number of interrupts per second is coupled to the electricity system of a region, though.

By Grauw

Ascended (10818)

07-12-2017, 13:16

The CPU is the same speed, but the display refresh rate is different.

To avoid screen tearing, games commonly synchronise to the display refresh, so the game "updates" every 1/60th of a second instead of every 1/50th. That makes character movement faster if it moves 1 or 2 pixels on every update, and your music play faster if the replayer advances one step in the song data on every update.
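A minimal sketch of that structure in C (wait_vblank, move_player and music_step are hypothetical stubs standing in for whatever the game really does; on an MSX they would talk to the VDP and PSG):

```c
/* Hypothetical per-frame work, stubbed out just to show the loop structure. */
static void wait_vblank(void)   { /* block until the VDP's frame interrupt fires */ }
static void move_player(int px) { (void)px; /* move the sprite px pixels          */ }
static void music_step(void)    { /* advance the replayer one step in the song   */ }

int main(void) {
    for (;;) {
        wait_vblank();   /* once per refresh: 50x or 60x per second               */
        move_player(2);  /* 2 px/frame = 100 px/s at 50 Hz, 120 px/s at 60 Hz     */
        music_step();    /* one song step per frame, so tempo follows the refresh */
    }
}
```

Everything inside the loop simply happens 20% more often on an NTSC machine, which is exactly the speed difference you see and hear.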

However, since the CPU is the same speed, it also means the game has less time to do its updates at 60 Hz than it does at 50 Hz. In games made for 60 Hz running at 50 Hz, the CPU usually sits idle for about 20% of the time. Conversely, games which are made for 50 Hz and which use the full CPU cannot really be run properly at 60 Hz.

That’s also why, when developing a game, it’s best to target 60 Hz; then it will be much easier to get it to work at both frequencies.
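One common way to bridge the two rates (this is just one approach, not a claim about how any particular game does it) is to design all movement and music for 60 updates per second and, on a 50 Hz machine, run the update routine a second time on every fifth frame, giving roughly 60 logic updates per second again:

```c
#include <stdbool.h>

static void wait_vblank(void) { /* wait for the frame interrupt (stub)          */ }
static void game_update(void) { /* one 60 Hz step: movement, music, etc. (stub) */ }

/* On a real MSX you would detect 50/60 Hz from the VDP or BIOS; hard-coded here. */
static const bool pal_50hz = true;

int main(void) {
    int frame = 0;
    for (;;) {
        wait_vblank();
        game_update();                          /* once per frame on both systems       */
        if (pal_50hz && (++frame % 5 == 0)) {
            game_update();                      /* extra catch-up step every 5th frame: */
        }                                       /* 50 + 10 = 60 updates per second      */
    }
}
```

It is only approximate (the real rates are not exactly 50 and 60 Hz), but it keeps the game and music speed close to what the 60 Hz version has.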

As for the relation between mains power frequency vs. the display refresh rate, I would direct you to this StackExchange question.

By meits

Scribe (6571)

07-12-2017, 13:36

I do understand why the display refresh rate needs to be the same as the frequency of the power line. Things can go wobbly. I've seen it happen and feared my monitor was broken. Turned out to be an electric clock I placed on top of it Big smile
But does the display frequency have to be the same as the number of interrupts per second? If so, why? Just because it's convenient? If I change the frequency of my PC screen, (I think) that's the only thing that changes.

By tfh

Prophet (3421)

07-12-2017, 13:46

Meits wrote:

I do understand why the display refresh rate needs to be the same as the frequency of the power line. Things can go wobbly. I've seen it happen and feared my monitor was broken. Turned out to be an electric clock I placed on top of it Big smile
But does the display frequency have to be the same as the number of interrupts per second? If so, why? Just because it's convenient? If I change the frequency of my PC screen, (I think) that's the only thing that changes.

As long as you make sure that the frequency of your monitor is the same as (or a multiple of) the number of frames from the image source, there is no problem. If there is a difference between the two, you will experience tearing, because the image is updated while it is being drawn.

By tvalenca

Paladin (747)

07-12-2017, 14:38

I think the whole point of the VDP interrupting the CPU is to tell it: "Hey CPU, I'm going to update the screen! If you have any changes to the graphics I'm showing, submit them ASAP!"

So, if the VDP refreshes the screen 50 times per second, it will interrupt the CPU at the same rate.

By syn

Prophet (2133)

07-12-2017, 15:14

Try playing a PC game with V-Sync turned off and a variable framerate.

You will often notice screen tearing, a stripe somewhere on the screen.

I'm not sure if this is 100% correct, but from my understanding it is like this: the monitor takes the "screen" info it gets from the source (through your SCART/VGA/whatever) and puts it onto the screen, from top to bottom, one line at a time. This whole buildup represents the refresh rate of the monitor. So for MSX monitors/old TVs etc. this happens either 50 or 60 times a second.

If your computer (game) runs at a higher or lower frame rate than your monitor and doesn't sync to the refresh rate of the monitor, it may happen that the "image" that is supposed to be on screen changes while the monitor is not done with the buildup. So the upper half (above the screen tear) is your "old" image or last "frame", while the bottom part is the new image/current frame.

Therefore in PC games, V-Sync is preferably turned on in most cases. It basically means the CPU or the game waits until the monitor is done "building up" the screen.

By Grauw

Ascended (10818)

07-12-2017, 15:26

Indeed. It is especially visible when there is a smooth scroll. Scrolling at 50 px/second while the refresh rate is 60 Hz will look very ugly, with a lot of tearing and/or stuttering.

Additionally, to make the game run at a consistent speed, so that it does not slow down when there are a lot of enemies or go super fast when there are none, you want to synchronise it to a timer. The music must be timed as well, because otherwise the tempo will vary, which sounds horrible (see the Xain games). And the display sync timer at 50/60 Hz is the only proper timer that is available on the MSX1. On the MSX2/2+ the alternative (setting line interrupts at intervals) is very cumbersome and still does not solve the tearing problem; VGMPlay is the only software where I've seen that being used.
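A rough sketch of the "synchronise it to a timer" idea in C (the names are hypothetical; on a real MSX the handler would hang off the BIOS interrupt hook): the interrupt only counts ticks, and the main loop runs one logic update per pending tick, so a frame that took too long is caught up on the next one instead of permanently slowing the game down:

```c
/* Incremented by the display interrupt, 50 or 60 times per second.
   In this sketch nothing calls it; on real hardware the VDP would. */
static volatile unsigned int ticks = 0;
static void frame_interrupt(void) { ticks++; }

static void game_logic(void) { /* fixed-step update: movement, music, AI (stub) */ }
static void draw_frame(void) { /* render the current state (stub)               */ }

int main(void) {
    unsigned int done = 0;
    for (;;) {
        while (done == ticks) { /* idle until the next tick arrives */ }
        while (done != ticks) { /* catch up on any ticks we missed, */
            game_logic();       /* so game speed stays constant     */
            done++;             /* no matter how busy a frame was   */
        }
        draw_frame();
    }
}
```

The music step would normally stay inside the interrupt routine itself, so that it is never delayed by heavy game logic.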

By TomH

Champion (375)

07-12-2017, 15:45

Grauw wrote:

As for the relation between mains power frequency vs. the display refresh rate, I would direct you to this StackExchange question.

I've also heard that it's to avoid too much visible strobing from studio lighting — that pulses slightly with the AC current, so you're trying to avoid creating a beat frequency through aliasing.

Because I think some of the posters above are assuming quite a lot of knowledge about what an interrupt is, I'm going to restate:

In an MSX the processor goes at the same rate regardless of the TV or monitor output standard. But processors support interrupts. An interrupt is a signal from an external piece of hardware that alerts the processor to an event. When the processor receives an interrupt it pauses what it is currently doing, does something related to the interrupt, then returns to whatever it was doing before. So in the abstract an interrupt could be a printer saying that it has finished the last batch of data (so the processor should send some more), or a keyboard announcing that a key has been pressed (so the processor should update the input buffer), or a floppy disk announcing that it has finished a seek (so the processor could initiate a read now), or anything else.

In a standard MSX there is only one thing that generates interrupts: the video display processor. Its job is to convert the current frame into a serial stream of video, so the only thing it really knows of potential interest to the CPU is the current stream position. In an MSX 1 the only event it announces is that it has finished the pixels for this frame and entered the single-colour bottom border. So when producing a 50Hz display that interrupt will occur 50 times a second. When producing a 60Hz display that interrupt will occur 60 times a second.

Being the only interrupt available, the end-of-visible-area interrupt is used for a bunch of unrelated things in an MSX: for example, because the MSX's keyboard doesn't generate interrupts, it's the cue used by the BIOS to scan for key changes.

In games you generally want your music just to be playing, without having constantly to worry about it throughout your game code. The sound chip can hold tones without processor intervention so playing a constant sound requires no effort. So the smart thing to do is to update which tones the sound chip is playing in the bit of code that is triggered by the display interrupt.
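As a concrete (and heavily simplified) illustration of updating the sound chip from the display interrupt, here is a C sketch; the song data is invented, and psg_write is a stand-in for the OUTs to the PSG's register-select and data ports, so treat the details as assumptions rather than any real replayer:

```c
#include <stdint.h>
#include <stddef.h>

/* Invented song data: one tone period for PSG channel A per frame step. */
static const uint16_t song[] = { 0x1AC, 0x17D, 0x153, 0x140, 0x11D, 0x0FD };
static size_t pos = 0;

/* Stand-in for writing a PSG register (ports 0xA0/0xA1 on an MSX). */
static void psg_write(uint8_t reg, uint8_t value) { (void)reg; (void)value; }

/* Called once per display interrupt, so 50 or 60 times a second. */
static void on_frame_interrupt(void) {
    uint16_t period = song[pos];
    psg_write(0, period & 0xFF);         /* channel A period, fine   */
    psg_write(1, (period >> 8) & 0x0F);  /* channel A period, coarse */
    pos = (pos + 1) % (sizeof song / sizeof song[0]);
}

int main(void) {
    /* On real hardware this would be installed on the interrupt hook;
       calling it in a loop here only demonstrates the stepping. */
    for (int i = 0; i < 12; i++) on_frame_interrupt();
    return 0;
}
```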

If you just step through the same list of tonal changes on every interrupt then you end up with the same music playing at a different tempo on 50Hz and 60Hz machines. Not because of CPU speed differences but just because that one job is timed by the display interrupt.

An arbitrary computer need not have any interrupts tied to the display output. A lot of machines, including a few of the MSX's contemporaries, have a timer that can produce interrupts, in which case it's easy to perform background tasks like music playback at an output-independent rate. An MSX 2 also allows the programmer to schedule a more complicated display-linked sequence of interrupts, so you could probably invent something very close to a 60Hz sequence of interrupts on a 50Hz MSX 2 and vice versa. But it'd be a lot of work when you can just play the music at a slightly different tempo and, anyway, who's ever going to know? Europeans who go on holiday to Japan and happen to play the same video games while they're there?

By Manuel

Ascended (19674)

07-12-2017, 22:35

Would the mains frequency be exactly the same as the interrupt frequency? I doubt that. The interrupt frequency is an exact divisor of the CPU frequency. And I doubt that one is directly coupled to the mains frequency.
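To illustrate with the commonly cited MSX figures (treat them as assumptions for this sketch: a 3579545 Hz Z80 clock, 228 CPU cycles per scanline, 262 lines per NTSC frame, 313 per PAL frame), the interrupt rate falls straight out of the CPU clock and is not exactly 50 or 60 Hz, let alone tied to the mains:

```c
#include <stdio.h>

int main(void) {
    const double cpu_hz          = 3579545.0; /* Z80 clock                */
    const double cycles_per_line = 228.0;     /* CPU cycles per scanline  */
    const double ntsc_lines      = 262.0;     /* scanlines per NTSC frame */
    const double pal_lines       = 313.0;     /* scanlines per PAL frame  */

    printf("NTSC frame rate: %.2f Hz\n", cpu_hz / (cycles_per_line * ntsc_lines)); /* ~59.92 */
    printf("PAL frame rate:  %.2f Hz\n", cpu_hz / (cycles_per_line * pal_lines));  /* ~50.16 */
    return 0;
}
```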
