Quote:
Originally Posted by Gorf
Therefore the whole test setting is biased:
To really make a useful comparison, you need an analog audio wave as a starting point.
Each machine must be allowed to sample and play back that tone at a sample frequency that best matches its inner workings.
Please don't call me stupid. Please look at the test programs I provided. They do adjust to "the starting point" (actually, that is called "the phase"), and they adjust to the frequency (by including a phase drift). They also adjust to the amplitude. Continuously tracking the phase is equivalent to adjusting the frequency.
The way this works is that the program fits a sine wave in double precision to the input, continuously adjusting the amplitude and the phase. Of course, if you can measure better and provide improvements to the program, you are welcome to.
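The fitting idea can be sketched roughly like this (a minimal illustration, not the actual test program; the function name `fit_sine` and the assumption that the capture spans a whole number of periods are mine). Projecting the samples onto sine and cosine at the test frequency recovers the amplitude and phase in double precision:

```python
import math

def fit_sine(samples, sample_rate, freq):
    """Estimate amplitude and phase of A*sin(2*pi*f*t + phi) in the samples.

    Assumes the capture spans a whole number of periods of `freq`, so the
    sine and cosine projections are orthogonal. A real measurement tool
    would also track phase drift continuously, as described above.
    """
    n = len(samples)
    w = 2.0 * math.pi * freq / sample_rate  # radians per sample
    # Least-squares coefficients of sin and cos via orthogonal projection
    a = 2.0 / n * sum(x * math.sin(w * i) for i, x in enumerate(samples))
    b = 2.0 / n * sum(x * math.cos(w * i) for i, x in enumerate(samples))
    # a*sin + b*cos == A*sin(. + phi) with A = hypot(a, b), phi = atan2(b, a)
    return math.hypot(a, b), math.atan2(b, a)

# Example: a synthetic 1 kHz tone at 44.1 kHz (exactly 100 periods)
rate, f = 44100, 1000.0
tone = [0.8 * math.sin(2 * math.pi * f * i / rate + 0.3) for i in range(4410)]
amp, phase = fit_sine(tone, rate, f)
# amp ≈ 0.8, phase ≈ 0.3
```

On real captures the residual after subtracting the fitted sine is what carries the noise and distortion you would actually compare between machines.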
Requesting a 16-bit 44.1kHz output is actually nothing "unusual" when someone claims "CD quality audio" on the Amiga: CD quality is exactly that, and not some other frequency. But, as always, the program allows you to adjust the frequency, so please perform your own measurements and provide results.
Critique is good when it is based on facts; it is bad when it is based on fiction, a.k.a. "fan-boy-ism".