Correct me if I'm wrong, but my understanding of Dolby TrueHD (which I read somewhere) is that when normalization is applied during the encoding process, the attenuation on the encoder is set to -2 dB, or is it -4 dB (the dialog normalization default)? That results in a lower SPL relative to DTS at an equivalent gain setting, so when it's decompressed… Since DTS-HD does not use normalization, the processed signal is relatively bit for bit. If anybody knows more about this, we'd like to know more.

"In order to minimize the limited space allocated on a DVD for audio soundtracks, DD and DTS utilize lossy data reduction algorithms, which reduce the number of bits needed to encode an audio signal. DD compresses a 5.1-channel surround track to 384–448 kbps (a DVD-standard limit; DD itself can go up to 640 kbps), while DTS uses much higher bit rates, up to 1.4 Mbps for CDs/LDs and 1.5 Mbps for DVD. In theory, the less compression used in the encoding process, the more realistic the sound will be, as it will better represent the original source. A higher bit rate must imply DTS will be superior sounding, right? DD tends to boast that its encoding method is more efficient than DTS and thus does not require the extra bit rate. However, even if DD is slightly more efficient, it is still not 1.5 /… Both DD & DTS will boast data rates, efficiency, etc., but what actually translates to better sound is a very ambiguous matter." (quote from Audioholics)

I hate to be the one to say this, but you're confusing volume with sound quality. Many people, yourself included apparently, feel that louder is better. DTS-HD MA, WAV, PCM, LPCM, TrueHD & FLAC are all lossless codecs, which means they carry exactly the same information and will sound identical on a properly set-up sound system with the master volume adjusted so that the level coming out of the speakers yields the same decibels. Loudness, however, is not a description of quality, just quantity. Provided the source material is mastered and then that same data is encoded, it's the same. The only way there could be a difference is if the audio was mastered and then encoded by, say, TrueHD, and then mastered again differently and encoded by, say, DTS-HD MA. But even if that second master were encoded with TrueHD again, it would sound different as well, because the difference comes from the remaster, not the codec.
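To put numbers on the dialnorm question above: a decibel offset maps to a linear amplitude factor via gain = 10^(dB/20). The sketch below just works that arithmetic for the two attenuation values mentioned (-2 dB and -4 dB); whether a given encoder actually applies either by default is exactly the open question.

```python
# Convert a dB attenuation into the linear amplitude factor a decoder
# would apply: gain = 10 ** (dB / 20) for amplitude.
for db in (-2.0, -4.0):  # the two candidate dialnorm offsets discussed above
    gain = 10 ** (db / 20)
    print(f"{db:+.1f} dB -> amplitude x{gain:.3f} ({gain * 100:.1f}% of full scale)")

# -2.0 dB -> amplitude x0.794 (79.4% of full scale)
# -4.0 dB -> amplitude x0.631 (63.1% of full scale)
```

Either offset is easily large enough to register as "quieter," which is why a TrueHD track with dialnorm applied can lose a sighted comparison against DTS-HD MA at the same volume-knob setting without a single bit of quality difference.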
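To put the quoted bit rates in perspective, here is a back-of-the-envelope calculation, assuming a 48 kHz / 24-bit / 6-channel (5.1) PCM source; the raw rate and the compression ratios follow directly from the figures in the Audioholics quote.

```python
# Raw PCM bitrate for a 5.1 track, assuming 48 kHz, 24-bit, 6 channels.
raw_bps = 48_000 * 24 * 6  # = 6,912,000 b/s, roughly 6.9 Mbps

for name, bps in [("DD @ 448 kbps", 448_000),
                  ("DD @ 640 kbps", 640_000),
                  ("DTS @ 1.5 Mbps", 1_500_000)]:
    print(f"{name}: ~{raw_bps / bps:.1f}:1 compression")

# DD @ 448 kbps:  ~15.4:1 compression
# DD @ 640 kbps:  ~10.8:1 compression
# DTS @ 1.5 Mbps: ~4.6:1 compression

# Ratio of the two headline rates (possibly the comparison the
# truncated "1.5 /" in the quote was heading toward):
print(f"DTS/DD rate ratio: {1_500_000 / 448_000:.2f}x")  # ~3.35x
```

In other words, both formats throw away most of the raw PCM data; DTS simply throws away less, which is why "more efficient" and "higher bit rate" are arguing past each other.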
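The "lossless codecs carry identical information" claim is directly testable. Below is a minimal sketch, assuming ffmpeg is on the PATH and that the file names are hypothetical stand-ins for two lossless encodes that should share one master: decode each track to raw PCM and compare hashes. Keep in mind that some decoders apply dialog normalization or channel-layout changes during decode, so a mismatch can reflect decoder settings or a different master rather than the codec itself.

```python
import hashlib
import subprocess

def pcm_sha256(path: str) -> str:
    """Decode the default audio stream to raw 24-bit LE PCM via ffmpeg and hash it."""
    proc = subprocess.run(
        ["ffmpeg", "-v", "error", "-i", path,
         "-vn", "-f", "s24le", "-c:a", "pcm_s24le", "-"],
        stdout=subprocess.PIPE, check=True)
    return hashlib.sha256(proc.stdout).hexdigest()

# Hypothetical file names; substitute your own rips of the same title.
a = pcm_sha256("movie_truehd.mka")
b = pcm_sha256("movie_dtshdma.mka")
print("bit-identical" if a == b else "streams differ (master, levels, or decode settings)")
```

If the hashes match, any audible difference at matched playback levels is, by construction, a level or setup mismatch rather than the codec.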