No, always. "Unproven Nyquist theorem".
Quote:
This has nothing to do with the Nyquist theorem. Your test uses 1/100th of the sampling frequency, not 1/2. In any case, aliasing due to the chosen frequency is off-topic; this thread is about 14-bit output. Your attempt here is a red herring.
Quote:
What people perceive, or what can be measured?
Quote:
Anyway, a bit more on topic: my understanding is that several methods exist for recording and playing back digital sound, and that at least one of them (DSD, IIRC) works by pushing its (in DSD's case large amount of) quantization noise so far up the spectrum that it becomes essentially irrelevant to what we hear. You can still measure it, but you won't ever hear it. This may, in a far more limited way, be what's going on with Paula and its high-frequency "resampling" as well.
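The principle quoted above can be sketched with first-order error feedback, a much simpler cousin of DSD's actual sigma-delta modulator (the function name and parameters here are my own, for illustration only):

```python
import numpy as np

def noise_shape(x, bits=4):
    """Requantize x (floats in [-1, 1]) to 2**bits levels, feeding each
    sample's quantization error into the next sample. The error then has
    a high-pass spectrum: less noise at low (audible) frequencies, more
    at high frequencies -- the idea DSD pushes to the extreme."""
    levels = 2 ** bits - 1
    out = np.empty_like(x)
    err = 0.0
    for i, s in enumerate(x):
        v = s + err                                   # carry previous error
        q = np.round((v + 1) / 2 * levels) / levels * 2 - 1
        err = v - q                                   # pass error onward
        out[i] = q
    return out
```

Comparing the error spectrum of this against plain rounding of a low-frequency tone shows less error energy in the low bins, even though the total error energy is larger.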
Quote:
Aud0Per = 128
Aud3Per = 125
Aud1Per = 128
Aud2Per = 127

Or maybe change the interrupt source from Aud0 to Aud2 or Aud3? Jochen Hippel's mixing routine originally used Aud3 as the interrupt source.
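For reference, Paula's output rate follows from the period register as rate = colour clock / period (the PAL colour clock is 3546895 Hz), so the values quoted above put each channel at a slightly different rate:

```python
# Paula sample rate = colour clock / period register value (PAL machine).
PAL_CLOCK = 3546895  # Hz

for name, per in [("Aud0Per", 128), ("Aud1Per", 128),
                  ("Aud2Per", 127), ("Aud3Per", 125)]:
    print(f"{name} = {per:3d} -> {PAL_CLOCK / per:8.1f} Hz")
```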
I think it's good that the quality of software audio output is questioned: some software is old, and we know more now and have better tools.
Noise. I replied on the topic of noise sources because they directly correlate with SNR, and thus with measurement (sampling) of bit resolution. Top-grade measurements and tools are required to tell which is which. Ears are certainly not good enough to say "this is 14-bit" or any other bit depth, but a sound card released in 1999 or earlier is not the right way to measure it either. (If played from the source without relays, aliasing noise can be heard by human ears in quiet portions of normalized sounds, but that is not the topic here.) If there is any resampling in the chain from input or generation to output and measurement, the aliasing distortion it introduces will raise the noise floor.

Volume. Volume must be maximized at the source to determine the noise floor. If it's played at half volume, the noise floor is effectively doubled or more once the signal is normalized at the measurement point. This means no adaptation to the level of the measuring device: a logging oscilloscope measures at the output level given, is not susceptible to impedance differences, and states its measurement accuracy in bits in the spec sheet. Volume includes software. The Amiga has no way to control the analog amplifier; any software that adjusts volume scales it digitally, and that reduces the number of bits by simple math, since the Amiga doesn't have floating-point DACs. Obviously, if the output is played at maximum volume, this paragraph is moot. The volume could be doubled if stereo only uses 2 of the 4 channels: assuming no phase issues, the same sound could be played through both Left channels, and vice versa. If not, the possible volume output of the DAC is halved, increasing the noise floor and therefore reducing the number of measurable bits.

Phase. If the test sounds are actually stereo, the two completely different waves are measured independently, and the same sound is not doubled up in Left (to increase volume, as per above) and vice versa, then there can be no phase issues exhibited by the Amiga, since the source sounds are two independent sounds measured independently. If the sounds in each stereo channel are modulated, the accuracy of the modulation is chip-internal, and the calculations generating the modulation wave must match.

Modulation. Modulation is limited to 6 bits, and will not quite double the volume the way playing an 8-bit wave in both Left channels (and vice versa) would; the output is thus quieter, which translates directly into a higher noise floor. Whether this modulation yields a given resolution depends in the end on the resolution of the Amiga DAC's data-bus inputs. Whatever resolution they manage is the maximum possible (not necessarily output) resolution per channel, or per combined stereo channel. The output is then subject to internal analog noise floors and measurement equipment, which further deduct from this resolution. But this is the max.

This is all there is to say on maximizing bit resolution at the source, to my mind. Just as I think the topic is good, I think it's good to question how we measure, and whether we measure the quality of software, because it can lead to better software. :great
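As an aside on what the 6-bit volume control can buy: the well-known "14-bit Paula" trick pairs two channels, playing the top 8 bits of a sample at volume 64 and the bottom 6 bits at volume 1, so the analog sum reconstructs a 14-bit value. A hypothetical sketch of the arithmetic (function names are mine, and the analog mixing is idealized):

```python
def split14(s):
    """Split a signed 14-bit sample (-8192..8191) into the two
    8-bit samples the paired channels would play."""
    hi = s >> 6        # top 8 bits, played at volume 64
    lo = s & 0x3F      # bottom 6 bits, played at volume 1
    return hi, lo

def analog_sum(hi, lo):
    """Idealized analog mixing of the two channel outputs."""
    return hi * 64 + lo

# Round-trip check over the full signed 14-bit range:
for s in (-8192, -1, 0, 1, 5000, 8191):
    assert analog_sum(*split14(s)) == s
```

In practice the reconstruction is only as good as the matching between the two volume steps, which is why calibration schemes exist; that matching is exactly the "resolution of the DAC data bus inputs" question raised above.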
1 Attachment(s)
So, here is an FFT-based measurement (probably to the pleasure of buggs). My major error in the previous release was that I measured the amplitude and not the power, hence the FFT-based SNR measurements were off by a factor of two.
This is a windowed FFT with a Hamming window, plus a small algorithm up front that tries to minimize leakage by taking windows at the zero-crossings of the signal. Other than that, the measurement results aren't substantially different. The Amiga is still at 36dB (a little less than with the previous time-domain method), and the PC is at 60dB (worse than with the previous method, which is probably not astonishing, as I'm not trying to remove phase jitter). Anyhow, the math works: it doesn't matter whether you measure in the time domain or the frequency domain, the Amiga output is substantially worse than the PC output. Oh, before somebody asks: the output for the "perfect" input signal is 95.5dB, very close to the theoretical limit of ~96dB one would expect for a quantized signal.
Quote:
There is no resampling going on, except the resampling of the ADC I'm using for measurement, which applies to both the PC and the Amiga signal. This resampling is unavoidable. You can see a little bit of aliasing noise at the high-frequency end of the residual signal if you measure the PC output, but that is several magnitudes (~40dB) smaller than the errors of the Amiga output. But, as always, everyone is invited to measure for themselves.