I have an S500 amp and Premier DAC, and the combination did not work well for me driving my Estelon X Diamond SE.
There is no AV pass-through to use as a preamp, and the sound is a bit bright and analytical for my taste, which I think comes from the Premier rather than the S500. I don't have the chance to go for a Reference or Cascade.
So is the S500 designed to work only with MSB's own DACs, or is it OK to use another brand's preamp with it? I know there is an impedance switch on the S500 for this, but is mixing in another brand's preamp acceptable?
I am now trying an OTL tube preamp, the Allnic L-9000 OTL/OCL. Any problems with tube preamps?
It is an OTL (output-transformerless) design, so it only has RCA outputs. If I use an RCA-to-XLR adapter at the input of the S500, is it dangerous or will it work? The S500 is XLR only, which is why I ask.
I use a tubed pre-amp with my Premier DAC. It allows the use of my phono amp, and injects just a tad of tube goodness.
ModWright LS300 - Atma-Sphere Class D monoblocks. Sorry to the purists here, but my 66-year-old ears were trained from my early days on my Dad's McIntosh MC60s + Klipschorns system.
Also, the current popular amps are too heavy to move easily. I chose lighter gear for retirement.
You can absolutely use a different brand preamplifier. Just make sure you change the impedance switch to 1.2k on the MSB amplifier first.
I would strongly advise against using a single-ended preamplifier. First, the Premier is fully balanced, so you are only using half of the DAC and, more importantly, you are not loading the output equally. If you need an XLR to RCA adapter, you should use ours.
The main issue is going from balanced to single-ended and vice versa. You can lose a massive amount of quality if not done right. Using cheaters is the wrong method.
I see elsewhere on the forum that it was mentioned to you to try different cables and speaker placement/system setup. I assume this didn’t have the desired effect?
Correction:
I was just told by Dustin that you can use an XLR to RCA cheater on the INPUT of the amplifier. This is no problem as long as pin 3 is connected to ground and not left floating. If left floating, you just won’t have any sound.
Dear Jonathan, I have a similar question. I run my Cascade via a Vitus MPL preamp to MSB M500s. The Vitus preamp has an output impedance of 80 ohms. Am I right that I should ideally set the M500 input impedance to 1.2k ohms? Would there be any harm if the M500 impedance is set at 75 ohms?
At the 1.2k ohm setting, the M500 sounds much softer than at 75 ohms, perhaps a 6 dB difference. Everything else was kept the same, with the gain setting at low. I am not a techie; I wonder whether you would expect a higher impedance setting at the M500 to sound softer or louder?
The amp's impedance switch rewires the primary windings on the amp's input transformer (the primaries are connected in various combinations of series and parallel). The amp does not change at all otherwise, but it will have less gain at 1.2k than at 75 ohms or 300 ohms, given your preamp's 80 ohm output impedance. This will not be an issue because you are using a preamp for your gain. All preamps I know of have plenty of gain and output voltage. If necessary, you can also use the amp's gain switch to increase the gain of the amplifier circuit (in roughly 6 dB steps).
The exact gain change with different input settings depends on the impedance of your source (or preamp). For example, if your source is 150 ohms, then switching between the 75 ohm and 300 ohm inputs will result in exactly the same gain for both positions. If that source had a 300 ohm output, then the 300 ohm position would have about 3 dB more gain than the 75 ohm position. Conversely, a 75 ohm source would have about 3 dB less gain on the 300 ohm input setting than on the 75 ohm setting.
Our amplifiers (unlike most others) essentially amplify the input power, not the input voltage. So a 75 ohm input being driven by a 300 ohm source results in less power transfer than a 300 ohm source driving a 300 ohm input. In fact, maximum power transfer (and therefore maximum gain) happens when the source and input impedances match. So a low impedance source (e.g. 80 ohms) driving a high impedance input (e.g. 1.2k ohms) will have lower gain.
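Dustin's power-transfer point can be sketched numerically. This is a minimal Python sketch under an idealized resistive source/load model (my assumption for illustration; the actual switch rewires transformer windings, as described later in the thread): power delivered into the input goes as R_in / (R_src + R_in)^2, so a 150 ohm source sees identical gain at the 75 and 300 ohm settings, and gain peaks when the impedances match.

```python
import math

def relative_gain_db(r_src, r_in_a, r_in_b):
    """Gain difference in dB between two input-impedance settings,
    assuming input power transfer P ~ R_in / (R_src + R_in)^2
    (idealized resistive model, not MSB's actual circuit)."""
    p = lambda r_in: r_in / (r_src + r_in) ** 2
    return 10 * math.log10(p(r_in_a) / p(r_in_b))

# A 150 ohm source: the 75 and 300 ohm settings give identical gain.
print(relative_gain_db(150, 300, 75))   # 0.0 dB

# An 80 ohm source (like the Vitus preamp) on 1.2k vs 75 ohm:
print(relative_gain_db(80, 1200, 75))   # ≈ -6.3 dB, i.e. noticeably softer
```

Under this simplified model, the 1.2k setting comes out about 6 dB quieter for an 80 ohm source, which is in the same ballpark as the difference reported above.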
Thanks, Dustin, for your detailed explanation. If I understand correctly, I should expect lower gain at the 1.2k ohm setting compared to 75 ohms with my 80 ohm preamp source. If so, may I ask why there is a general recommendation to use a higher impedance, e.g. 1.2k, on the amp? What is the benefit or advantage in terms of sound quality?
My understanding is that 1.2k is an easier load to drive, so your preamp is less likely to clip. The only time lower gain is an issue is if your preamp reaches 100% volume before you've reached an acceptable listening level.
Hmmm... not sure how that applies to MSB amps. At 1.2k, I find I lose about 6 dB of gain and volume, so I have to crank up my preamp. It feels like a harder load to drive.
It is much easier to drive at 1.2k, just with a bit less gain for low-impedance-output preamps. The 75 ohm input setting needs the preamp manufacturer's OK for the low impedance, because it will seriously stress a lot of output driver designs.
Let me explain. It is much easier to lift your car using a floor jack that takes several pumps of the handle than to lift it using only your arms. The jack is easier to use but requires more pumps (low gain); the brute-force approach would lift the car faster but be much more difficult (high gain). The 1.2k setting is easier to drive (less current flow) but requires more voltage. The 75 ohm setting is much harder to drive (more current flow) but requires less voltage. Both settings require the same total input power to produce the same amplifier output, because power equals voltage times current.
To reach full rated power on medium gain (500W into 8 ohms) a 500 series amp requires about 40 milliwatts of input power.
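Working backward from that 40 mW figure (a rough sketch, assuming a purely resistive input so that P = V²/R applies; the real transformer input will differ somewhat), the input voltage and current needed for full rated output differ across the impedance settings even though the input power is the same, which is the "more voltage vs. more current" trade-off described above:

```python
import math

P_IN = 0.040  # ~40 mW input power for full rated output at medium gain, per above

for r_in in (75, 300, 1200):
    v_rms = math.sqrt(P_IN * r_in)          # V = sqrt(P * R)
    i_rms = math.sqrt(P_IN / r_in) * 1000   # I = sqrt(P / R), in mA
    print(f"{r_in:>5} ohm: {v_rms:.2f} Vrms, {i_rms:.2f} mA")
# Prints approximately:
#    75 ohm: 1.73 Vrms, 23.09 mA
#   300 ohm: 3.46 Vrms, 11.55 mA
#  1200 ohm: 6.93 Vrms, 5.77 mA
```

Same 40 mW in every case: the 75 ohm setting asks the preamp for four times the current of the 1.2k setting, while the 1.2k setting asks for four times the voltage.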
I happened to talk to the designer of Vitus preamp. Like me, he had expected the 1.2k ohm input on the M500 to sound louder than the 75 ohm input, when driven by the 80 ohm output of the Vitus preamp. That being said, he also recommends 1.2k ohm.
The confusion probably comes from the fact that most amplifier designers expect input stages to respond to voltage only. He probably assumed the different impedances were resistors slapped onto the input, so that a 75 ohm resistor would eat up some of the input voltage via the voltage-divider effect, lowering the input voltage and making the result quieter. But that's not how the MSB 200 and 500 series amps work. The inputs respond to power (voltage times current), not voltage alone.
With my Vitus preamp fixed at 80 ohm output, if I wish to have more gain, should I lower the M500 input impedance from 1.2k to 300, or increase the M500 gain setting from low to medium to high? Or does it not matter which approach I take? I am asking from both a sound quality and a technical perspective. Thanks a lot.
If you're using your preamp, I would use the gain that it provides; that's the whole point of a preamp. If you aren't at full volume on your preamp while listening, I would keep the amp at 1.2k. If you need more gain, use the gain switch on the amp.
If you don’t like the sound your preamp provides then remove it.