Hi everyone,
I've been testing NAM lately and I'm interested in the calibration feature. My understanding (correct me if I'm wrong) is that if an amp was captured at 12 dBu and my audio interface's maximum input level is 20 dBu, I should basically set my interface to unity (+0 dB gain) and set the NAM input gain to +8 dB.
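To make sure I'm reading it right, here's the arithmetic as I understand it (a minimal sketch of the idea, not the actual NAM client code; the function name is mine):

```python
def nam_input_gain_db(capture_level_dbu: float, interface_max_dbu: float) -> float:
    """With the interface input at unity gain, 0 dBFS corresponds to the
    interface's maximum input level. A signal that is capture_level_dbu
    in the analog domain therefore reads (capture_level_dbu - interface_max_dbu)
    dBFS, and NAM adds the difference back so the model sees the same
    digital level the amp was captured at."""
    return interface_max_dbu - capture_level_dbu

print(nam_input_gain_db(12.0, 20.0))  # → 8.0
```

So the +8 dB is just the interface headroom (20 dBu) minus the capture level (12 dBu).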
That's basically what the client code is doing here: https://github.com/sdatkinson/Neura...b8/NeuralAmpModeler/NeuralAmpModeler.cpp#L653
I'm using an old AudioBox 44VSL interface with a -30/+50 dB gain range on the input control. My problem is that the unity position (+0 dB gain) isn't clearly marked on the knob, so I'm wondering: is there a simple way to establish it with certainty using some kind of calibrated test signal? I've searched the web and thought about it, but I can't figure it out. Am I missing something obvious here?
At first I was thinking of generating a sine wave at a known dBFS level (say -30 dBFS) in my DAW (e.g. a test tone in Ableton), sending it out of the interface with the main output volume at its maximum position (+0 dB), plugging it back into the input, and finding the knob position where the DAW meters the same -30 dBFS on the input.
But I'm not sure this makes any sense, since to my understanding different interfaces will produce different analog levels for the same -30 dBFS, depending on the interface's maximum output level (+10 dBu in my case).
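Working through the numbers seems to confirm my doubt. A rough sketch, assuming my interface's specs are +10 dBu output at 0 dBFS and +20 dBu maximum input (the function and its parameters are just mine for illustration):

```python
def loopback_reading_dbfs(daw_level_dbfs: float, out_max_dbu: float,
                          in_max_dbu: float, input_gain_db: float = 0.0) -> float:
    """What the DAW input meter would read after a loopback:
    DAW level -> analog level on the cable -> back to dBFS at the input."""
    analog_dbu = daw_level_dbfs + out_max_dbu       # level leaving the output
    return analog_dbu - in_max_dbu + input_gain_db  # level the DAW meters

# -30 dBFS out, +10 dBu output stage, +20 dBu input stage, input at unity:
print(loopback_reading_dbfs(-30.0, 10.0, 20.0))  # → -40.0, not -30.0
```

If this is right, the knob position where the meter reads -30 dBFS back wouldn't be unity at all, but rather a gain equal to the input max minus the output max (+10 dB in this example), so the loopback only finds unity directly when the two specs happen to match.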
Any help/explanation will be deeply appreciated.