How is the displayable area determined for programmed modes? I checked various ECS modes (the monitor drivers use slightly different settings on AGA machines, maybe something related to fetch mode?).
It seems that HTOTAL + 1 - (HBSTOP - HBSTRT) gives the number of displayable colour clocks (multiply that by 8 to get the number of 35ns pixels). At least for all modes I checked except EURO36.
Calculating that for the Debian mode (HTOTAL=0071, HBSTRT=0001, HBSTOP=0023):
71 + 1 - (23 - 1) = $50. $50 × 8 = 640
(At the moment WinUAE shows 638 pixels for that mode, I think.)
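Assuming that relationship holds, it can be written as a small helper (the function name and the pixels-per-clock parameter are mine, not anything from the OS):

```python
def displayable_pixels(htotal, hbstrt, hbstop, px_per_clock=8):
    """Displayable width = (HTOTAL + 1 - (HBSTOP - HBSTRT)) colour clocks,
    times the number of pixels per colour clock (8 for 35ns pixels)."""
    clocks = htotal + 1 - (hbstop - hbstrt)
    return clocks * px_per_clock

# The mode above: HTOTAL=$0071, HBSTRT=$0001, HBSTOP=$0023
print(displayable_pixels(0x71, 0x01, 0x23))  # 640
```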
By "max horizontal overscan" I mean the largest width you can set using Overscan prefs.
EURO72 (max horizontal overscan 680): HTOTAL=0071, HBSTRT=0001, HBSTOP=001E
HTOTAL + 1 - (HBSTOP - HBSTRT) = 71 + 1 - (1E - 1) = 72 - 1D = $55. $55 × 8 = 680
EURO36 (max horizontal overscan 720): HTOTAL=00E2, HBSTRT=0008, HBSTOP=002F
HTOTAL + 1 - (HBSTOP - HBSTRT) = E2 + 1 - (2F - 8) = $BC. $BC × 4 = 752
For EURO36 the maximum selectable overscan width (720) is 32 pixels less than that.
Multiscan (max horizontal overscan 680): HTOTAL=0071, HBSTRT=0001, HBSTOP=001E
HTOTAL + 1 - (HBSTOP - HBSTRT) = 71 + 1 - (1E - 1) = $55. $55 × 8 = 680
SUPER72 (max horizontal overscan 936): HTOTAL=0091, HBSTRT=0001, HBSTOP=001E
HTOTAL + 1 - (HBSTOP - HBSTRT) = 91 + 1 - (1E - 1) = $75. $75 × 8 = 936
DBLNTSC and DBLPAL (max horizontal overscan 744): HTOTAL=0079, HBSTRT=0001, HBSTOP=001E
HTOTAL + 1 - (HBSTOP - HBSTRT) = 79 + 1 - (1E - 1) = $5D. $5D × 8 = 744
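All the register values listed above can be checked in one go with the same formula (the mode table is just copied from this post; EURO36 uses ×4 instead of ×8 as noted):

```python
modes = {
    # name: (HTOTAL, HBSTRT, HBSTOP, pixels per colour clock)
    "EURO72":         (0x71, 0x01, 0x1E, 8),
    "EURO36":         (0xE2, 0x08, 0x2F, 4),
    "Multiscan":      (0x71, 0x01, 0x1E, 8),
    "SUPER72":        (0x91, 0x01, 0x1E, 8),
    "DBLNTSC/DBLPAL": (0x79, 0x01, 0x1E, 8),
}

for name, (htotal, hbstrt, hbstop, ppc) in modes.items():
    # displayable colour clocks, scaled to pixels
    width = (htotal + 1 - (hbstop - hbstrt)) * ppc
    print(f"{name}: {width}")
```

This prints 680, 752, 680, 936 and 744, matching the per-mode calculations above.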
Example of different register values used on AGA machines by the DBLNTSC monitor driver:
DBLNTSC: HTOTAL=0081, HBSTRT=0001, HBSTOP=0021
HTOTAL + 1 - (HBSTOP - HBSTRT) = 81 + 1 - (21 - 1) = $62. $62 × 8 = 784
Max selectable AGA horizontal overscan is 720, which is 64 SHRES pixels (= 8 colour clocks) less than 784. Is the difference related to 4x fetch mode and/or sprites?
