It was hard to find the right forum for this question!
On real hardware and in WinUAE, SysInfo 3.24 scores an A500 at 1.03 relative to the A600 under Kickstart 1.3, but at 1.00 under Kickstart 3.1 (1.00 being the result you'd expect in the first place, yet there it is: 1.03, and SysInfo 3.24 was released before Kickstart 3.1 existed).
My guess is that Kickstart 3.1 simply does more in the background than 1.3, and that this extra work leaves less processing time for applications. That the score happens to land on exactly 1.00 under Kickstart 3.1 on real hardware is probably just a number; pure coincidence.
Out of curiosity, I'd like to ask those who write OS-friendly applications and have had to deal with performance: have you noticed this, and what specifically might it be due to?
And if it is due to 3.1, what could I turn off so that my OS application performs maximally, without shutting down the OS completely the way demos and games do?