There's not a lot of hard information out there, so I made a few guesses. If anyone has better data, I'd love to hear it, because I couldn't find any.
Fact: PSX GPU has a 53.203 MHz reference clock
Source:
http://www.mediafire.com/?zbp2bhb4y117xp7
Fact: PSX GPU supports heights of 240 and 480 lines, and widths of 256/320/384/512/640 pixels
Source:
http://www.horningabout.com/jimb/psx/gpu.txt
Fact: The NTSC square pixel clock (single height) is 12,306,394Hz.
Source:
http://lipas.uwasa.fi/~f76998/video/conversion/
Guess: Dot clocks are integer dividers off the reference clock.
Justification: It would be simple and easy to implement.
Guess: Dot clock dividers were chosen to give screen aspect ratios of approximately 4:3 with the usual vertical overscan (approximately 448 of 480 lines visible).
Justification: There'd be no reason to have lots of overscanned horizontal pixels; developers wouldn't be able to use them. A large underscan would also mean useless black pillarboxes on most TVs.
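If both guesses hold, the dividers should fall out of a simple search: for each width, pick the integer divider whose resulting display aspect ratio is closest to 4:3. Here's a sketch in Python; the clock constants come from the facts above, and the 224-line visible height is my overscan assumption, not anything documented:

```python
# Sketch: derive the dot-clock dividers from the 4:3 + overscan guesses.
GPU_CLK = 53.203e6          # PSX GPU reference clock, Hz (fact above)
SQ_PIXEL_CLK = 12.306394e6  # NTSC square pixel clock, single height, Hz (fact above)
H_VISIBLE = 224             # assumed visible lines out of 240 (448 of 480)

def best_divider(width):
    """Integer divider of GPU_CLK whose display AR is nearest 4:3."""
    def dar(div):
        dot_clk = GPU_CLK / div
        par = (SQ_PIXEL_CLK / dot_clk) / 2  # 240p pixels span two display lines
        return width * par / H_VISIBLE
    return min(range(1, 16), key=lambda d: abs(dar(d) - 4 / 3))

dividers = {w: best_divider(w) for w in (256, 320, 384, 512, 640)}
print(dividers)  # {256: 10, 320: 8, 384: 7, 512: 5, 640: 4}
```

The search recovers exactly the dividers in the table below, which is what makes me reasonably comfortable with the guesses.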
If you accept all of these assumptions, the results are easy to calculate:
Base CLK = 53.203 MHz

W     Divider   PAR (240p)   DAR (H=224)
256   10        1.157        1.322
320   8         0.925        1.322
384   7         0.810        1.388
512   5         0.578        1.322
640   4         0.463        1.322
For 480i resolutions, just double the pixel AR (an interlaced pixel spans one display line instead of two). Then multiply that pixel AR by whatever actual width/height the engine requested to get the display AR. Example: 256x240 (240p) gives DAR = 256 × 1.157 / 240 ≈ 1.234.
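As a sanity check on that rule, the PAR/DAR for any mode can be computed directly. A sketch under the same assumptions, with the clock constants taken from the facts above and the dividers from the table:

```python
# Compute pixel and display aspect ratios for a PSX video mode,
# assuming the integer-divider guess.
GPU_CLK = 53.203e6          # PSX GPU reference clock, Hz (fact above)
SQ_PIXEL_CLK = 12.306394e6  # NTSC square pixel clock, single height, Hz (fact above)
DIVIDERS = {256: 10, 320: 8, 384: 7, 512: 5, 640: 4}

def aspect_ratios(width, height, interlaced=False):
    """Return (pixel AR, display AR) for a requested width x height."""
    dot_clk = GPU_CLK / DIVIDERS[width]
    par = SQ_PIXEL_CLK / dot_clk        # pixel width in square pixels
    if not interlaced:
        par /= 2                        # 240p pixels span two display lines
    return par, width * par / height

par, dar = aspect_ratios(256, 240)
print(round(par, 3), round(dar, 3))  # 1.157 1.234
```

Passing interlaced=True for a 480i mode gives the doubled pixel AR (2.313 for the 256-wide modes), matching the rule above.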
Note that the extra width of the 384 resolution (5% more than the others) can easily be observed in practice.
There's additional evidence supporting these dividers in the PS2: it has access to the same video resolutions, and its MAGH register values correspond to these dividers exactly. Without more understanding of how the clock select works, though, that's not proof of anything.