

Posted

Just posting an observation I've made with digital transports. I'm sure jitter plays some part in why some sound different from others, but something else I've noticed lately is that the output signal level from a digital transport also seems to make a difference in sound quality. I noticed this recently since I have a few DACs in the house right now, all with varying analog output levels, ranging from 2 to 3.5 Vrms through the RCA (single-ended) output. Sometimes the output on the high-output DACs is just a bit too much for my amps (the lack of headroom on the headphone amp's volume pot is annoying), which is what got me messing around with the volume level of the digital transport.

The output level is adjustable on the M-Audio Transit and the Squeezebox v3. On the Transit the output level is maxed out in Windows and not adjustable there, but Foobar2000 lets you adjust the level of the digital signal. The Squeezebox v3 has a built-in volume control (it adjusts both its analog and digital outputs), which is supposedly defeated when the volume is maxed.

What I've noticed is that lowering the volume on either transport to about 80% and turning up the headphone amp's pot to compensate results in a degradation of resolution compared to leaving the digital volume at 100% and running the amp's pot lower. I tried my best to volume match using a Ratshack SPL meter during these comparisons.

This isn't so noticeable on the HD650, but when I switch over to the Omega 2 the difference in resolution is quite obvious: less micro detail and slightly reduced dynamics. I have no idea whether lowering the volume on those transports to 80% results in truncated bits or not.

I'm wondering if this has something to do with possible differences between CD player (and sound card) digital transports, i.e. one CDP putting out a lower-level digital signal than a competitor.

Any thoughts? Has anyone else noticed this or experimented?

Posted

The two things you are looking for from a digital transport are:

a) low, clean jitter

b) bit-perfect output

By altering the volume (or applying any sort of EQ or other processing), you are throwing (b) out the window. Most simple digital volume controls basically throw away information (starting with the least significant bits) in order to reduce the level. In fact, on Windows, if you don't use ASIO to output from your player (foobar, etc.) straight into your output device (FW/USB interface, sound card, whatever), Windows' built-in audio subsystem (kmixer) will jack around with the bits and throw (b) out the window all by itself.
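The information loss from a naive digital volume control can be sketched in a few lines (Python for illustration, with hypothetical sample values): scaling and re-quantizing collapses distinct samples onto the same output value, so the discarded bits are unrecoverable.

```python
def attenuate(sample: int, volume: float) -> int:
    # A naive digital volume control: scale the 16-bit sample, then
    # re-quantize by truncating back to an integer sample value.
    return int(sample * volume)

# Two different 16-bit samples collapse to the same output at 80% volume,
# so no later gain stage can tell them apart again: the least significant
# information is simply gone, and the stream is no longer bit-perfect.
assert attenuate(12345, 0.8) == attenuate(12346, 0.8) == 9876
```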

One fun experiment is to try playing a lossless rip of an HDCD-encoded track into an HDCD-equipped DAC. You'll notice that the moment you play with the volume on the digital source, the HDCD light on the DAC goes out. Since the HDCD encoding lives in the least significant portion of the digital bitstream, it's the first thing to go...

On another note, bit-perfect is bit-perfect. The majority of CD transports output bit-perfect, which is the same as a sound card or Squeezebox outputting bit-perfect. If a digital source is not true to the ones and zeros of the source material, it isn't bit-perfect.

Interestingly, though on a bit of a tangent: prior to algorithms such as Apogee's marvelous UV22, even in the analog-to-digital conversion process (i.e. going from a 2-track analog tape master to a DAT tape or another digital format for distribution), you really had to keep the signal level as high as possible without going "over" digital 0 in order to get maximum resolution from the Red Book-required 16 bits. If you don't feed the ADC a hot enough level, you'll never use all 16 available bits. Even with the help of UV22 adding "inaudible" noise to keep more bits in the game, you ideally still want your digital master to peak at digital 0 and not go over, since going over results in clipping. UV22 means you can get a fair amount more resolution during quiet passages, allowing for greater usable dynamic range. I actually think that 16/44.1 Done Right is Not That Bad...
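UV22's noise shaping is proprietary, but the basic idea can be sketched with plain TPDF dither (Python for illustration, hypothetical sample values): adding a small amount of noise before re-quantizing decorrelates the quantization error from the signal, which is what keeps low-level detail audible where outright truncation would destroy it.

```python
import random

def truncate_24_to_16(s24: int) -> int:
    # Plain re-quantization: drop the bottom 8 bits outright.
    return s24 >> 8

def dither_24_to_16(s24: int) -> int:
    # TPDF dither: add triangular-PDF noise spanning roughly +/-1 LSB of
    # the 16-bit target before truncating, so the rounding error becomes
    # benign noise instead of distortion correlated with the signal.
    noise = random.randint(-128, 127) + random.randint(-128, 127)
    return (s24 + noise) >> 8
```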

Posted (edited)

One fun experiment is to try playing a lossless rip of a HDCD-encoded track into a HDCD-equipped DAC. You'll notice that the moment you jack with the volume, the HDCD light on the DAC goes out.

Very interesting. One of my DACs is HDCD equipped. Any suggestion for playing back HDCD material (and keeping the HDCD flag intact) with Windows? IIRC WMP9 (or 10) used to be able to do it.

None of my CDP digital transports have variable analog or digital output.

In fact, on Windows, if you don't use ASIO to output from your player (foobar, etc) straight into your output device (fw/usb interface, sound card, whatever), Windows' built-in audio subsystem (kmixer) will jack around with the bits and throw (b) out the window.

I believe the M-Audio Transit's default drivers are already bit-perfect once you select "M-Audio Transit" as the output device in Foobar2000. The DTS passthrough test to an HT receiver confirms this and outputs a clean signal rather than just noise.

Which raises the next question:

Can you have bit-perfect CDPs with varying digital output levels?

Edited by deepak
Posted

foobar with asio and a properly ripped hdcd -> lossless will turn on the light.

I see people who post hdcd rips that don't turn on the light. I generally attribute that to them fucking up the rip.

Posted
Very interesting. One of my DACs is HDCD equipped. Any suggestion for playing back HDCD material (and keeping the HDCD flag intact) with Windows? IIRC WMP9 (or 10) used to be able to do it.
WMP 9/10 tried to do HDCD decoding internally, and therefore would NOT pass HDCD information on to a DAC. If you are using Foobar and can pass the "DTS" test, you'll output HDCD information correctly.
I believe with the M-Audio Transit the default drivers are already bit perfect once you select "M-Audio Transit" as the output device in Foobar2000. The DTS pass through test to a HT receiver confirms this and outputs a clean signal rather than just noise.
In that case, it sounds like you are outputting bit-perfect and will be able to light the HDCD light on a HDCD-equipped DAC.
Which raises the next question:

Can you have bit-perfect CDPs with varying digital output levels?

No. Altering the volume level in the digital domain by definition alters the bitstream.
Posted
No. Altering the volume level in the digital domain by definition alters the bitstream.

I may be on the wrong track here by misunderstanding something that I have read... but what if you run 16-bit material through a 24-bit or higher DAC, and at 0 dBFS output you just fill the bottom 8 bits with zeroes?

Posted
I may be on the wrong track here by misunderstanding something that I have read... but what if you run 16-bit material through a 24-bit or higher DAC, and at 0 dBFS output you just fill the bottom 8 bits with zeroes?
Correct. The "top" 16 bits are bit-perfect and the "bottom" 8 are inactive. Full 16-bit resolution is maintained.
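That zero-padding can be sketched in a few lines (Python, with a hypothetical sample value): shifting the 16-bit word into the top of a 24-bit word and back recovers it exactly, so nothing is lost.

```python
def pad_16_to_24(s16: int) -> int:
    # Place the 16-bit word in the top of a 24-bit word;
    # the bottom 8 bits are zero-filled padding.
    return s16 << 8

def unpad_24_to_16(s24: int) -> int:
    # Drop the padding to get the original 16-bit word back.
    return s24 >> 8

sample = -20543                      # arbitrary signed 16-bit value
assert unpad_24_to_16(pad_16_to_24(sample)) == sample
```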
Posted

I thought so :)

The guys at Twisted Pear are big proponents of digital volume control with the new Sabre 32 chip. They argue that the wide data path and high DNR allow much better attenuation than what people are used to from digital devices.

Posted
I thought so :)

The guys at Twisted Pear are big proponents of digital volume control with the new Sabre 32 chip. They argue that the wide data path and high DNR allow much better attenuation than what people are used to from digital devices.

I'm not familiar with how the Sabre 32 chip works, but there are indeed ways to do volume control in the digital domain without adversely affecting resolution. It still won't be perfect, though, and it would not pass DTS or HDCD encoding. Since an advanced digital volume control doesn't apply to what deepak has at hand at the moment, I've been avoiding that theoretical discussion in order to ram home the point that simple digital volume manipulation reduces resolution by throwing away bits.
Posted

In the back of a Stereophile review of the DAC1, I read the Benchmark engineer encouraging the use of the digital volume control in iTunes, which, at least on a Mac, runs by default at 24 bits with what is considered to be an extremely good SRC.

My understanding is that, as long as you are outputting a digital stream at a bit depth greater than the actual resolution of the file being played, you can digitally attenuate the signal without loss until, in a 24-bit stream playing a 16-bit original, you have used up all 8 spare bits.
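That headroom argument can be sketched as follows (Python, hypothetical values). Padding a 16-bit sample into a 24-bit word leaves 8 spare bits, so attenuating by exact powers of two (6 dB per bit) up to 48 dB just shifts the data down into the padding and remains exactly recoverable; arbitrary gain factors do re-quantize, but the error stays below the original 16-bit floor while spare bits remain.

```python
def attenuate_24bit(s16: int, steps: int) -> int:
    # Pad a 16-bit sample to 24 bits, then attenuate by steps * 6 dB
    # (a right shift of `steps` bits within the 24-bit word).
    return (s16 << 8) >> steps

sample = 12345
for steps in range(9):               # 0 dB to 48 dB in 6 dB increments
    out = attenuate_24bit(sample, steps)
    # Undo the attenuation and the padding: the original sample survives
    # exactly as long as the 8 padding bits can absorb the shift.
    assert (out << steps) >> 8 == sample
```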
