Hi,

I'd like to find a formula for calculating the time needed to transfer a disk. But there's something I can't understand...

Theoretically, a disk transferred from Amiga to PC at 115200 bps SHOULD take 7.82 seconds, no?

The equation would be like:

115200 bytes / 1 sec = 901120 bytes / x sec

Solving for x gives a transfer time of 7.82 seconds!
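The arithmetic above can be sketched like this (a minimal sketch, assuming the standard 880 KB Amiga DD disk of 901120 bytes, and taking the 115200 figure as bytes per second, exactly as in the proportion above):

```python
# Naive transfer-time calculation, mirroring the proportion above:
# 115200 bytes / 1 sec = 901120 bytes / x sec
disk_size = 901120   # bytes on an 880 KB Amiga DD disk (assumed)
rate = 115200        # link speed, taken here as bytes per second

x = disk_size / rate
print(f"{x:.2f} seconds")  # prints "7.82 seconds"
```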

That would be a dream, but it isn't realistic!

Where's the "bug" here?