Calibrating Input Level for Plugins

This thread has fired up a discussion between me and another developer I've been in contact with recently, so I invited him to join me on a livestream on my YouTube channel to discuss this publicly.

He is Alessandro from APFX Audio, and we'll be live tomorrow at 3 PM GMT.

We will speak Italian, but if some of you want to chime in, we'll try to answer some comments/questions in English too.

 
I’m attacking y’all for...
There you have it. No one is attacking anyone but you and the devs who say that we mislead or spread false dogmas.

I've got nothing else to discuss with you. Not going to apologize for defending myself.

That said, if someone says something false, I'll point it out... I don't care if you consider it an attack.
 
I just wish plugin companies would state the spec of their plugin so people can do what they want with that info; it's literally that easy. As someone who was hoping they'd give this information out, that was just a long way of not giving out any information and telling people to guess / "use your ears bro".

Exactly. I wouldn't buy an audio interface with no specs published by the manufacturer (no one would), and I am not buying any guitar amp plugin if I don't know the specs. Before I buy I check the website and manuals; if I don't find what I am looking for I'll contact support; if I can't get an answer I'll move on. My money, their loss. I am tired of companies playing the "accuracy" card and then not delivering that accuracy at all. I don't mind a plugin built from scratch with some unique sound not being accurate, but if you mimic a real amp to the point of acquiring the right to use the original brand (Tone King Imperial MKII, Morgan amps, Milkman Creamer, Benson Chimera, the Fender collections on Amplitube), it is because you are selling accuracy, and if I am buying accuracy that's what I want.

And no, I don't mind breaking accuracy if I find a tone that I like, but most of the time I am curious about the gear I am playing with and I like to feel and hear how it behaves.

There are lots of benefits you get from calibration: I can create a preset and someone else can use it and hear it as I intended, as long as we all have some common ground; I can watch a video of someone dialing in a plugin and get the exact same results; I can work on a project with other guitarists and the settings are perfectly portable across different systems; if you want the character of a Princeton 65 you can go and pick that amp to get that kind of sound, instead of using a Triple Rectifier because with my settings it happens to sound like a Princeton 65; etc.
 
I love how the goalposts keep moving on this topic... Since forever, buying a Fractal, Helix, Kemper profiles, etc. was always about getting a rock-solid digital representation of gear (FAS/Helix had extra idealised amps as well). No one really expected early POD Farm to be that accurate (ballpark stuff), and JST stuff was idealised and good enough. Amplitube 3/4/5 implied their gear was a faithful recreation, and NDSP with Nameless and their lineup also implied faithful recreations... The whole point of buying these BRAND-name digital amp sims is to get a highly faithful experience --> great sounds... isn't all of that completely obvious?

Ed, James and the wider nerds start championing amp sim accuracy with some basic steps to hit the plugin input spec, hooray... this is great!

All of a sudden... "I dunno man, that's all too hard, I just ride the input gain to where it sounds about right... dId yOu aCtUaLlY tHiNk tHiS iS a rEaL aMp?" The denial is just super funny to me. Wasn't the whole point of technology getting better and new amp sims coming out the pursuit of the best recreations possible? And with about 2% effort you can calibrate your setup to what the devs configured the plugin for.

I don't care what anyone else does on their machine, and I'm not here to preach or lecture anyone; many times I don't even bother adjusting if I'm just practicing or pulling a two-second tone for a quick mix. But to go out of your way to say "accuracy is pointless", "ballpark is good enough, who cares" and all those kinds of things... just a big yikes.

I'm not responding to anyone in this thread, just having a thought haha. As a side note, it was great to see Steve from NAM asking the community what the input spec of NAM should be... Wow, it's almost like calibration is real and exists... and optimising things will give everyone "the best" results with the same software. Who would have guessed.
 
That was honestly a terrible way to set up the poll. It becomes a simple popularity contest and Focusrite Scarletts are pretty popular. I mean the UA Arrow is exactly the same but people picked the Focusrite option nevertheless.

If he had just listed the dBu values I can bet the numbers would look different. Either very few votes because people don't understand the question, or most likely picking the highest number.
 
Also, allowing any old peenarse to add to the poll assumes that they've got their information right.

For example, the UCXII is not explicitly +13dBu. It is actually switchable on the input between +19dBu and +13dBu. It is switchable on the output between +19, +13, and +4 dBu.

Just assuming that everyone is going to use the +13dBu setting is dumb.
 
That was honestly a terrible way to set up the poll. It becomes a simple popularity contest and Focusrite Scarletts are pretty popular. I mean the UA Arrow is exactly the same but people picked the Focusrite option nevertheless.

If he had just listed the dBu values I can bet the numbers would look different. Either very few votes because people don't understand the question, or most likely picking the highest number.
I dunno what's best, but Neural and a bunch are around +12 to +13dBu. Coincidentally a bunch of interfaces are also around that number at 0 gain… seems to be a good ballpark to run with.
 
Also, allowing any old peenarse to add to the poll assumes that they've got their information right.

For example, the UCXII is not explicitly +13dBu. It is actually switchable on the input between +19dBu and +13dBu. It is switchable on the output between +19, +13, and +4 dBu.

Just assuming that everyone is going to use the +13dBu setting is dumb.
Also, turning the instrument mode on (for proper impedance) adds +6dB of gain, so +19dBu with inst on is actually +13dBu and +13dBu turns into +7dBu. It would make more sense if those figures changed once you turned the mode on. Wouldn't surprise me if people selected +13dBu with inst mode on and then wondered why things were so hot (because it's actually +7dBu).
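To put the arithmetic in one place, here's a rough sketch in Python; the +6dB figure is the instrument-mode gain I mentioned above, so treat it as an assumption and check your own interface's manual:

# Rough sketch of how a fixed instrument-mode gain bump lowers the effective
# maximum input level (the analog level that now reaches 0 dBFS).
INSTRUMENT_MODE_GAIN_DB = 6.0  # assumed extra gain when instrument mode is engaged

def effective_max_input_dbu(max_input_dbu, instrument_mode):
    return max_input_dbu - INSTRUMENT_MODE_GAIN_DB if instrument_mode else max_input_dbu

for setting in (19.0, 13.0):
    print(f"+{setting:.0f} dBu setting, inst mode on -> "
          f"+{effective_max_input_dbu(setting, True):.0f} dBu effective")
# +19 dBu setting, inst mode on -> +13 dBu effective
# +13 dBu setting, inst mode on -> +7 dBu effective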
 
Yeah - that happened to me lol
 
Also, allowing any old peenarse to add to the poll assumes that they've got their information right.

For example, the UCXII is not explicitly +13dBu. It is actually switchable on the input between +19dBu and +13dBu. It is switchable on the output between +19, +13, and +4 dBu.

Just assuming that everyone is going to use the +13dBu setting is dumb.
Hold the insults, will ya?
 
I truly wonder how many people have actually run a 1Vp signal through their instrument input, because using the headroom of that input as a reference isn't necessarily going to give you the right value to adjust your gain. My main interface has 14dBu of headroom, but with 0 preamp gain and a 1Vp@1kHz signal I get -17.5dBFS. I am using a precise signal generator, and I measured the signal with an oscilloscope (1Vp = 0.707Vrms), so the recommendation of 14 + 0.79 (i.e. expecting roughly -14.8dBFS) would be almost 3dB too weak in my case if I want to reach 0dBFS.

Without knowing how the AD conversion has been implemented in the audio interface, you can't really know how an analog signal is going to translate into the digital realm.
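Here is the arithmetic behind that check as a small Python sketch; the 14dBu and -17.5dBFS values are my figures from above, so substitute your own:

import math

V_RMS_0DBU = 0.7746        # 0 dBu reference voltage (about 0.775 Vrms)
max_input_dbu = 14.0       # headroom stated for my input
measured_dbfs = -17.5      # what the meter actually showed for the test tone

signal_vrms = 1.0 / math.sqrt(2)                        # 1 Vpeak sine -> ~0.707 Vrms
signal_dbu = 20 * math.log10(signal_vrms / V_RMS_0DBU)  # ~ -0.79 dBu
expected_dbfs = signal_dbu - max_input_dbu              # assumes max input level = 0 dBFS
print(f"expected {expected_dbfs:.2f} dBFS, measured {measured_dbfs:.2f} dBFS, "
      f"difference {measured_dbfs - expected_dbfs:.2f} dB")
# expected -14.79 dBFS, measured -17.50 dBFS, difference -2.71 dB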
 
I suspect most don't measure, but I've tested a lot of different inputs of my own, and helped a bunch of people online.

There are some anomalies, like PreSonus's gain knobs being 15dB below the value stated in the manual, RME having 6 or 8dB on the preamp that isn't accounted for in the value in the manual, etc.

Most inputs I've measured seem to be very close to the stated value though, within +/- 0.1dB. Generally speaking I've had D/A line outs vary in level more, maybe around +/-0.5dB on some interfaces. 3dB is quite a loose tolerance/variance from the stated specs, enough that I'd probably even check with the manufacturer to see if that's normal.

It's best if people measure, but most simply won't do it; even people who have the gear and capacity to do it simply CBA. They just have to accept that relying on the stated spec is a quicker and easier approach at the expense of the best accuracy.
 
But it is going to depend on the reference level (dBu to dBFS) used; are we sure it is standard for all audio interfaces? I don't think it is, as far as I know...
 
I keep seeing their Black Friday ads and I do really want these guys to succeed. I just can't help thinking that, despite what they say here, it's an irrefutable fact that there is one exact input level that will yield the most similar experience in gain levels to their real amp. By definition it has to exist, and it's pretty trivial for a dev to work out what it is and disclose it.

They’ve been a bit stubborn on other things before but have slowly added them in, and I think their intentions are good even if it takes a bit of work to get there. I think €40 or so is a fair price for basically any amp sim these days, and certainly for a smaller independent company I’d feel like it’s less of a gamble.
 
But it is going to depend on the reference level (dBu to dBFS) used; are we sure it is standard for all audio interfaces? I don't think it is, as far as I know...
How do you mean? If we know the analog level that yields 0dBFS for a particular interface, then we know the relationship between any input level and its digital level.

All interfaces have their own specs, and tolerances and other variables in design may influence how they provide the information.

The ideal is to check, but generally speaking the specs in the manual are correct and helpful.
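To spell out what I mean, here's a generic sketch (made-up example numbers, not any particular interface's spec):

# Once you know the analog level that reaches 0 dBFS, every other level follows,
# and so does the trim needed to match a plugin's stated input spec.
def to_dbfs(signal_dbu, dbu_at_0dbfs):
    return signal_dbu - dbu_at_0dbfs

def calibration_trim_db(interface_dbu_at_0dbfs, plugin_spec_dbu):
    # gain to add before the plugin so both agree on what "0 dBFS" means in dBu
    return interface_dbu_at_0dbfs - plugin_spec_dbu

print(to_dbfs(4.0, 19.0))               # +4 dBu into a +19 dBu input -> -15.0 dBFS
print(calibration_trim_db(19.0, 13.0))  # +19 dBu interface, +13 dBu plugin spec -> add +6.0 dB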
 
I mean that the maximum input level refers to the highest analog signal level (dBu) the preamp can handle without clipping (in the analog domain) before being converted to digital; that doesn't mean it will hit 0dBFS (digital domain) at that level.

So you need the preamp specs plus the ADC reference level to calculate the signal level in the digital domain exactly, or you need to measure the dBFS you get using a reference signal.

One audio interface may have a reference value (analog to digital) of +4 dBu = -18 dBFS and another one may have +4 dBu = -20 dBFS. So if I am not terribly mistaken, using the maximum input level of an input is not enough to get a proper calibration.
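A rough sketch of the measurement route (the tone level here is just an example, a 1Vp sine at about -0.79 dBu):

# Derive the analog level that corresponds to 0 dBFS from a measurement,
# instead of assuming it equals the preamp's stated maximum input level.
def dbu_at_0dbfs_from_measurement(reference_dbu, measured_peak_dbfs):
    return reference_dbu - measured_peak_dbfs

# e.g. a test tone known to be about -0.79 dBu (1 Vpeak) metering at -17.5 dBFS:
print(dbu_at_0dbfs_from_measurement(-0.79, -17.5))  # ~ +16.7 dBu actually hits full scale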
 