Originally Posted by DDNI
impressed as I am with these new cards, I am amused how important the often dismissed "Meaningless Indicator of Processor Speed"
unit of measure has become so very
Correct, it has become unimportant on 2010 PCs, because:
- PC OSes rely on heavy background access to secondary storage
- memory is much slower than the CPU
- the OS piles on a huge number of background services, drivers, and managers to handle all the extra things a PC OS must do these days
On Amiga (at least on OS < 3.9, I don't know the others):
- memory is much less of a bottleneck, at least below the 68040
- partly due to the allocate/load/use/free paradigm
- the OS interferes as little as possible
- "no" background hard-disk accesses or heavy background tasks
This means that what makes MIPS less meaningful on e.g. a PC is two bottlenecks, memory and hard disk; since the Amiga OS doesn't need them, those are largely gone there. Say OS3.1 runs decently at 5 MIPS: doubling the MIPS basically doubles the speed of everything, even maxing out some programs so that you get an 'instant response'.
Also, the MIPS levels available on the Amiga are all well below 1% of current CPUs, which means that even for moderately complex programs such as a word processor, every MIPS counts. A better example is an assembler or compiler, where a compile can take 15, 30, or 120 seconds even on the fastest Motorola CPU available, i.e. it will never hit that 'instant response' ceiling.
The same is basically true for all portable devices such as cameras, phones, and gaming handhelds. More MIPS in your ARM instantly makes them more responsive, or even capable of heavy tasks such as running emulators or browsers.