Standard practice has been to look at a char in the kernel that marks a console as being NTSC-J (char: I), NTSC-U (char: A), or PAL (char: E).
The practice is then to initialize the GPU at 60Hz for NTSC-J and NTSC-U, and at 50Hz for PAL.
Code:
// SCEx region character in the BIOS (we *must* check != 'E' rather than
// == 'A', or else a Japanese PSX ('I') would be treated as PAL)
if (*(char *)0xBFC7FF52 != 'E')
{
    // NTSC (NTSC-J or NTSC-U)
}
else
{
    // PAL
}
This approach rests on two assumptions:
- 50Hz output is what the user of a PAL console actually wants
- the console is unmodified, so the region character can be trusted
These days though, there are several problems with that.
- Many modern displays don't work well with 50Hz sources, especially when considering upscalers and line doublers.
- Modern displays are often built for a native 60Hz refresh rate, and feeding them 50Hz can put them in a degenerate display mode where scrolling isn't smooth and the active video area is reduced.
- Quite a few consoles have swapped BIOS chips, often making an NTSC-J system report as PAL.
- Quite a few setups use RGB (rather than CVBS or S-Video). These setups can freely choose between 50Hz and 60Hz, as no color decoding is needed.
If we look at classic CRTs, those sold in the PAL region typically support both 50Hz and 60Hz over RGB.
And if the CRT was made after ~1990 or so, it typically also supports NTSC color decoding.
So given all this, I want to propose developing a new method that detects the actual system hardware (specifically the GPU clock speed).
And, using that method, I want to propose defaulting to NTSC timings more often on supposed (and real) PAL consoles.
If that new system detection method is successful, it would be great to include it as a library call in homebrew SDKs such as psxsdk or PSn00bSDK.