Calibrating Input Level for Plugins

Ok, so I decided to find out the correct input level for different plugins of the same amp (Soldano) by trying to match the amount of gain in the sound of each one. Assuming the Helix has the correct input level (I'm using an HX Stomp as my interface), I fixed the gain knob of each plugin at 5, with the master at 3. In my setup, this combination had a very dynamic response.

I've yet to vary the gain on each one and see how they respond at these levels, but here are the results so far:

Helix Native: 0 dB (reference)
Neural DSP Soldano: 0-0.5 dB
STL Solstice Solo: 9-10 dB (6 dB didn't make it, unfortunately)
AmpliTube 5 SLD 100: 10 dB
Nembrini BST100 V2: 4-5 dB (this one seems to have a different type of gain; very hard to match)

All of this depends on a lot of things and will pretty much vary from setup to setup.
Maybe with more tests by different people we'll start seeing a trend in the input levels.
 
What if the 12AX7 tube models by each company have slightly different amounts of gain?
What if each model has a different internal boost?
What if the model has been modeled incorrectly?
What if they don't have the real amps for reference and just model the schematic? <-- most common.

There are way too many variables to earball the amount of gain; that would be true even if every company provided calibration values.
We only know that Helix has a correct gain correlation between the models and the real amps they modeled; Ben Adrian has said many times that the model matches their real amp closer than two identical real amps match each other.

The only other modeler with an accurate gain correlation between models and the real amps is the Axe-Fx 3, but that will never leave the hardware realm and thus will never need calibration.
 
Sure, all of this really matters, but at the same time, after the level matching some models started sounding closer to others.

I did this not to write rules for each dev or amp sim, but to show that input level really matters and can make an amp sim work (or not) for you.
 

Sure thing, I would do that too if there were no information and I knew for sure the reference model was accurate.

Actually, I would do this:

1. Run normal-level (around -15 dB) pink noise into the reference plugin with flat settings (G, B, M, T, P = 5, 5, 5, 5, 5) and Master at 1 (important, to avoid power-amp compression), then freeze the spectrum with SPAN Plus.

2. Lower the pink noise to -70 dB (really quiet) and take another spectrum shot; now you have two reference points from the reference Soldano model. This step is the important part, because it measures the input level in the linear region, before the input distorts and compresses.

3. Load the other Soldano plugin, run the -15 dB pink noise into it, and match its Output Level (NOT Master) to the reference amp's spectrum. Both amps should have exactly the same GBMTPM settings; only match the Output Level.

4. Lower the pink noise input to -70 dB and adjust the Input Level until it matches the reference -70 dB spectrum. That is the Input Level required to match the actual gain of both models when the Gain knob of each is at 5.

The result is far more accurate than earballing, provided both amp models are supposed to have the same gain range.
This is technically "gain matching," which is part of tone matching; it's NOT the calibration we are talking about in this thread.
It really deserves a whole other thread.

In this example I had to boost the Input of the second Soldano by +6 dB to match the reference Soldano, all with the same settings.
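If you want to sanity-check why the -70 dB step recovers the gain difference, here's a rough Python sketch. The tanh curves are purely hypothetical stand-ins for the two plugins (and I'm using white noise instead of pink for simplicity), so only the method is real, not the numbers:

```python
import numpy as np

def rms_db(x):
    return 20 * np.log10(np.sqrt(np.mean(x ** 2)))

# Hypothetical stand-ins for the two Soldano plugins:
# same clipping curve, but amp_b has 6 dB less internal input gain.
def amp_a(x):
    return np.tanh(40.0 * x)

def amp_b(x):
    return np.tanh(20.0 * x)   # 20/40 = -6.02 dB of pre-gain

rng = np.random.default_rng(0)
noise = rng.standard_normal(1 << 16)
noise /= np.sqrt(np.mean(noise ** 2))   # normalize to 0 dBFS RMS
quiet = noise * 10 ** (-70 / 20)        # -70 dBFS: the linear region (steps 2 and 4)

# In the linear region, the output level difference IS the gain difference,
# so this is the required input boost for amp_b:
offset_db = rms_db(amp_a(quiet)) - rms_db(amp_b(quiet))
print(round(offset_db, 2))  # ≈ 6.02 dB
```

At -70 dB both curves are effectively linear, so the measured output difference equals the internal gain difference, which is exactly the boost step 4 dials in.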

Gain Matching.png
 

Tried it with very good results.

Thanks!
 
@James Freeman I've been away on travels with only a small iRig HD2 interface, and took the time tonight to follow your calibration process. It turns out the iRig HD2 is dead-on correct with its input gain wheel at minimum. I used the same sine generator plugin you used, calibrated the output to exactly 0.500 V AC RMS, and the hard gate popped open with the input slider at -0.22 dB with your threshold spec. Thanks again for working out the details of this for us to replicate.

TL;DR for others: the iRig HD2 gives the perfect input level into Helix Native when its gain wheel is set to minimum. That makes it a very attractive option for a small and portable interface for Native. :)
 
There's a weekly recording thang going on!

Give it a go!
 
Thanks James! Cool thread. I used the stock signal generator in Studio One. After following this procedure, my SSL 2 interface wants a Helix Native input of around -5.0 dB. Hope I did it right...
 
SSL2 Spec for Instrument Inputs:

SSL2 Instrument.png


+15 dBu is 4.356 V RMS.
20*log10(0.500/4.356) = -18.8 dBFS
Our target with 0.500 V is -15.3 dBFS.

Going by manufacturer specs alone you had to boost +3.5 dB; if you did everything correctly, that demonstrates specs can be FAR off.
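For anyone who wants to replicate the spec math, here's the same arithmetic as a quick Python snippet (0 dBu = 0.7746 V RMS; the +15 dBu max instrument input is from the SSL 2 spec quoted above):

```python
import math

DBU_REF = 0.7746  # 0 dBu = 0.7746 V RMS

def dbu_to_vrms(dbu):
    return DBU_REF * 10 ** (dbu / 20)

def dbfs_of(vrms, full_scale_vrms):
    return 20 * math.log10(vrms / full_scale_vrms)

fs = dbu_to_vrms(15.0)               # SSL 2 spec: +15 dBu max instrument input
print(round(fs, 3))                  # 4.356 V RMS
print(round(dbfs_of(0.500, fs), 1))  # -18.8 dBFS; target is -15.3, so +3.5 dB short
```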
 
irig hd2.png


8.36 Vpp = 2.956 V RMS.
20*log10(0.500/2.956) = -15.43 dBFS.

That's 0.1 dB from our target of -15.3 dBFS, as confirmed by @SwirlyMaple.
Kudos to IK for providing very accurate specs.

EDIT:
By deduction, that could mean all IK products (AmpliTube, ToneX) have a calibration value of 0.500 V = -15 dBFS, just like Helix/Native. :unsure:
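Same check in Python for the iRig HD2 numbers (the 8.36 Vpp max input is from IK's spec sheet above; a sine's RMS is Vpp divided by 2*sqrt(2)):

```python
import math

vpp = 8.36                        # iRig HD2 spec: max input 8.36 Vpp
vrms = vpp / (2 * math.sqrt(2))   # sine wave: Vpp -> V RMS
print(round(vrms, 3))             # 2.956 V RMS
print(round(20 * math.log10(0.500 / vrms), 2))  # -15.43 dBFS
```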
 
I probably screwed it up if I'm off by 8.5 dB. I'm going to try again later today using the same tone generator you used.

You have the output knob turned all the way up on the interface when taking the reading with the multimeter, correct?
 
James, I can't express how thankful I am for your thorough research and this very thread. <3

I sat down and played through Native this morning, after sticking to non-computer-bound stuff for a few months, and it's the first time the amps have actually sounded "right".

I'm using an older Focusrite Scarlett 2i4, which is decent enough for me, but in the past I always made the mistake of adjusting the input levels at the interface, rather than leaving them at zero.

I tweaked and tweaked until it sounded alright, but I've never been 100% happy - until now.

Truly astonishing. Everything sounds so good now! :) Thanks very much, highly appreciated.
 
Bumping this after some testing.

I'm really starting to believe that IK AmpliTube wants 0.707 V RMS at 0.0 dBFS; that's where the amp models are most similar to Helix in terms of gain.
A lot of SPICE-based software modelers without internal compensation treat digital full scale as a virtual 1 V peak (0.707 V RMS).
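Purely as a back-of-the-envelope check of that hunch (the 0.707 V RMS full-scale figure is my assumption, not an IK spec), here's what that calibration would imply relative to Helix's 0.500 V = -15.3 dBFS:

```python
import math

# Helix/Native reference from earlier in the thread: 0.500 V RMS reads -15.3 dBFS,
# so the implied full-scale voltage is:
helix_fs_vrms = 0.500 * 10 ** (15.3 / 20)   # ≈ 2.91 V RMS

# Hypothesis: AmpliTube treats full scale as 1 V peak = 0.707 V RMS
at5_fs_vrms = 0.707

# How much hotter a signal AmpliTube would expect for the same gain response:
offset_db = 20 * math.log10(helix_fs_vrms / at5_fs_vrms)
print(round(offset_db, 1))  # 12.3 dB
```

A ~12 dB difference would line up with needing to add a lot of input gain to IK stuff, but again, this rests entirely on the 0.707 V assumption.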
 
Checks out. I've found myself needing to crank AT5 a fair bit to get it feeling right. I sort of wish they had a global calibration preference, because IK stuff wants signals that are WAY louder than I'd ever record with, and I'm always having to add gain at the input when using their guitar stuff.
 