Big up @James Freeman for not only this thread but also the old KVR one, and for putting those clowns in their place who just didn't understand the importance of this.
I followed the guide last night for calibrating for Helix Native. My chain is a bit more complex as I’m not using a single interface, but rather a combination of separate DI, A/D, and D/A. Not only that, but the DI I’m using can directly feed the converters without the need for a preamp and the A/D has adjustable calibration for various dBu references.
My signal was coming in around 2dB lower than the level that would trigger the gate to open / what I'd get using the HX Stomp, which seemed right based on my experience with the HW. [EDIT: this is actually incorrect - I just tried calibrating again at 707mV RMS with the gate at -12.2 and my DI was calibrated absolutely dead on].
I then spoke to one of the devs who coded STL’s plugins (he’d previously offered to help me calibrate for using their software). I’m sure he won’t mind me sharing his procedure (which is very much what James suggested years back on KVR):
Sure, first thing to do is to loop a sine wave in your DAW (better to use a signal generator than a .wav file, to avoid discontinuities in the output), something like 60Hz, with a maximum peak of 0dB - basically the clipping threshold.
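If you'd rather render the test tone yourself than use a generator plugin, here's a minimal Python/NumPy sketch of the idea (the 48kHz sample rate is my assumption - match it to your interface). Using a whole number of cycles is what makes the loop gapless, which is the dev's point about discontinuities:

```python
import numpy as np

SR = 48000     # sample rate (assumption: set to your interface's rate)
FREQ = 60.0    # test-tone frequency from the procedure
SECONDS = 2

# Full-scale (0 dBFS peak) sine. A whole number of cycles means the
# buffer ends exactly where it starts, so looping it has no click.
cycles = round(FREQ * SECONDS)
n = int(SR * cycles / FREQ)
t = np.arange(n) / SR
tone = np.sin(2 * np.pi * FREQ * t)

peak_db = 20 * np.log10(np.max(np.abs(tone)))
rms_db = 20 * np.log10(np.sqrt(np.mean(tone ** 2)))
print(f"peak: {peak_db:.2f} dBFS, RMS: {rms_db:.2f} dBFS")  # peak 0.00, RMS -3.01
```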
Then set your multimeter to AC (1V range if possible) and connect a cable to the soundcard output (leave the other jack disconnected), then set your soundcard output level to the maximum.
Connect the multimeter probes to the tip and sleeve of the cable jack and see what RMS value you get. Now tweak the DAW output level until you read exactly 0.707V on the multimeter - that's your setting for output unity gain.
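In case the 0.707V figure looks arbitrary: the RMS of a sine is its peak divided by the square root of 2, so reading 0.707V RMS on the meter means the interface is putting out exactly 1V peak when the DAW hits 0dBFS. A quick sanity check of the numbers (pure math, nothing interface-specific):

```python
import math

peak_volts = 1.0
rms_volts = peak_volts / math.sqrt(2)
print(f"{rms_volts:.3f} V RMS")   # 0.707 V RMS for a 1 V peak sine

# Same value expressed in dBV (reference: 1 V RMS):
dbv = 20 * math.log10(rms_volts)
print(f"{dbv:.2f} dBV")           # -3.01 dBV
```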
Now connect the free jack of the output cable to the input of your soundcard in order to send the signal back into your DAW. Just make sure the track receiving the input doesn't route its output to the master bus, or the two will sum and clip everything, resulting in squeals.
Monitor the input and tweak the soundcard input level until you see a peak of exactly 0dB: that's your input unity gain level. Note the soundcard input level setting somewhere (or take a picture), then decrease the level until you read a -6dB peak, note that too, then repeat for -12dB.
Now you know exactly the input settings for unity gain, -6dB and -12dB.
Connect your guitar straight to the soundcard input set to unity gain and play. Does it clip? If not, you're done; if it does, set the input to -6dB and repeat. Still clipping? Set it to -12dB and repeat...
Now you know what kind of attenuation the soundcard is applying to your guitar, so when you hit the first guitar plugin of your rig, make sure to boost the input to compensate for the loss: if you set the input to -6dB, boost by 6dB; if -12dB, boost by 12dB.
Now the plugin sees exactly the same signal your amplifier sees when playing.
Keep in mind that most amp sims have an internal boost to compensate for the fact that most users set their soundcard input level low. For AmpHub this boost is +6dB, so if your soundcard is set to -6dB, set the AmpHub input level to 0, or to +6dB if your soundcard is set to -12dB. Disregard the Input Level Listener at this point - you have your levels set perfectly.
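Put another way: the plugin's internal boost already counts toward the compensation, so the input knob only needs to make up the difference. A sketch of that arithmetic, using the +6dB AmpHub figure quoted by the dev (function name and structure are my own illustration):

```python
def plugin_input_setting(soundcard_setting_db: float,
                         internal_boost_db: float) -> float:
    """Input-knob setting so that (internal boost + knob) exactly
    cancels the soundcard's attenuation."""
    needed = -soundcard_setting_db       # total boost required
    return needed - internal_boost_db

AMPHUB_INTERNAL_BOOST = 6.0  # per the dev's description above

print(plugin_input_setting(-6.0, AMPHUB_INTERNAL_BOOST))   # 0.0
print(plugin_input_setting(-12.0, AMPHUB_INTERNAL_BOOST))  # 6.0
```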
The final paragraph here is really the critical piece of information that we need from developers - there's no standard amount of internal boost across modellers and emulations. Emulations with less internal boost will favour a hotter signal; ones with a larger boost will want a lower DI signal.
For instance (if I’m understanding this right), STL Amphub wants a 6dB hotter signal than Helix Native.
It boggles the mind that this information isn't freely available, or even mentioned anywhere for the most part. Do we as audio engineers calibrate tape machines and consoles by ear? Or are they regularly calibrated for any change in circumstances and components? Why should we treat any other gear differently, particularly when there's so much amplification of the signal involved?
Developers understandably assume that it’ll go over the head of most users, but ultimately it just means we’ve set the bar too low. There’s already been far too much misinformation about modelling and blanket statements made without conditions being made fair.
I think it would be very handy if we could collect the calibration levels for as many pieces of software and HW as possible and have them easily accessible. I can already anticipate that some companies will be more willing to help than others, but the reality is everyone is likely to be using a combination of different software with a whole range of different HW.
I'm sure I read a post from IK's rep on KVR saying that they modelled their amp sims with different calibrations for each model..... if this is the case then fair enough (but please also tell us what each of those values is - there's enough space in the manual for it).