TwoNotes GENOME!!!

but that doesn't mean auto-leveling won't get it right in this day and age.
Auto levelling in a plugin doesn't know what guitar you have, what pickups you use, how they're wired, how high they are, what strings you use, what tuning, what pick, how hard you hit, what capacitance your cable has, or what impedance your input has. A guitar isn't a good tool for measurement or calibration - that's why we use a constant voltage source like a sine wave, which has known, consistent and repeatable properties.

Loud pickups will probably clip a 12-13dBu input. 14dBu is probably a safer level, although I'd also argue that the peaks that do get clipped won't make a great deal of difference in the long run. People have been using (and clipping the fuck out of) the 5dBu input on Kempers for ages and no one seems to bat an eyelid.
 
Auto levelling in a plugin doesn't know what guitar you have, what pickups you use, how they're wired, how high they are, what strings you use, what tuning, what pick, how hard you hit, what capacitance your cable has, or what impedance your input has. A guitar isn't a good tool for measurement or calibration - that's why we use a constant voltage source like a sine wave, which has known, consistent and repeatable properties.

Exactly my concern.
 
No - but that is precisely what you'd be doing if the plugin maker delivered any number. Unless you did it with a test tone at a fixed level - which you could do every bit as well using Guitar Rig.
The number the plugin maker provides should have no bearing on the signal level of a guitar (that is precisely what we're trying to move away from) - it just needs to relate to the headroom of the input. As I said, using a guitar's dBFS level is a terrible way to get accuracy.

Yes, you can use a fixed sine wave to measure, but you can also just use your interface specs (which are often provided with the interface gain at 0). Then not only do you not need to calibrate with "your loudest guitar", you can set the input once and it's correct forever. Easier, faster and more accurate.
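For what it's worth, the arithmetic behind the interface-spec approach is trivial. Here's a minimal Python sketch; both numbers in the example are hypothetical (they don't come from this thread), and `input_trim_db` is just an illustrative name:

```python
def input_trim_db(interface_max_dbu, plugin_cal_dbu):
    """dB of gain to add before the plugin so it sees the level it
    was calibrated for.

    interface_max_dbu: the dBu level that hits 0 dBFS on your
                       interface (from the spec sheet, gain at minimum).
    plugin_cal_dbu:    the dBu level the plugin developer maps to
                       0 dBFS internally.
    """
    return interface_max_dbu - plugin_cal_dbu

# Hypothetical numbers: interface clips at +13 dBu, plugin calibrated
# to +6 dBu -> boost by +7 dB at the plugin input.
print(input_trim_db(13, 6))  # 7
```

If the two numbers match, the trim is 0 dB and no adjustment is needed.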
 
Just do it like in Guitar Rig: press "learn", hit the loudest thing your guitar can do and call it a day. Helix Native is fine too, as you can just trim the input level so it's sitting around that little white mark on the input meter.
Mate, MirrorProfiles is explaining it to you (he knows much more than me). You're doing it wrong, just as wrong as I was doing it until some clever guys showed us how to deal with input gain to get ACCURATE responses from the sims in the plugins. Not necessarily better... just more accurate, closer to how the real amps would behave.

There are plenty of explanations, a dedicated thread, and several YouTubers have joined in and understood the logic behind it. Some plugin developers have understood it too. Not all of them have understood it yet, but they will as soon as they realise they're wrong.
 
No, he's still not explaining which level I should set once and then forget about it.
 
Which is exactly what you'd get with your hottest guitar or a constant test tone. It doesn't matter whether things are adjusted automatically or by a set value.
Using your loudest guitar isn't accurate - which pickups? How high are they? If the discussion is about accuracy, then strumming a random guitar loudly isn't going to cut it. My idea of loud pickups may not be the same as someone else's, and even with the same guitar and pickups, other variables could affect the level.

Software could do it automatically if you feed it a known voltage sine wave - Tonocracy works like this.
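To illustrate the sine-wave measurement itself (this is not Tonocracy's actual code, just a sketch with made-up meter readings): send a sine of known voltage into the input, note where it lands in dBFS, and the gap tells you the interface's clipping point in dBu.

```python
import math

DBU_REF_VRMS = 0.7746  # 0 dBu reference voltage (rms)

def vrms_to_dbu(vrms):
    """Convert an rms voltage to dBu."""
    return 20 * math.log10(vrms / DBU_REF_VRMS)

def interface_headroom_dbu(tone_dbu, measured_dbfs):
    """If a test tone of known dBu level reads `measured_dbfs` on the
    DAW meter, the clipping point (0 dBFS) sits this many dBu up."""
    return tone_dbu - measured_dbfs

# Made-up example: a 1 Vrms sine is ~+2.2 dBu; if the DAW meter shows
# -10.8 dBFS, this interface clips at roughly +13 dBu.
tone = vrms_to_dbu(1.0)
print(round(interface_headroom_dbu(tone, -10.8), 1))  # 13.0
```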

Which input exactly? Your interface input?
If you don't want to measure with a sine wave, another possibility is to just use an interface DI input or a modeller - those commonly provide "maximum input headroom" as a spec. If you know your headroom, and you know the target level for the plugin you're using, you can compensate accordingly. I use a separate DI box with notched gain levels, so I know the headroom of each setting and can pick whichever makes the most sense, with repeatable consistency. The settings can be measured either with a known-voltage sine wave or by comparing the level to a known interface input. Took me about a minute to work out.
 
If you don't want to measure with a sine wave, another possibility is to just use an interface DI input or a modeller - those commonly provide "maximum input headroom" as a spec. If you know your headroom, and you know the target level for the plugin you're using, you can compensate accordingly. I use a separate DI box with notched gain levels, so I know the headroom of each setting and can pick whichever makes the most sense, with repeatable consistency. The settings can be measured either with a known-voltage sine wave or by comparing the level to a known interface input. Took me about a minute to work out.

All fine but that still doesn't answer my question.

you can set the input once and it's correct forever.

Which input exactly?
 
Whichever input you like - the point is you can use ANYTHING and adjust accordingly. It's how you remove the variable of different inputs using different definitions of dBu->dBFS.

You said I could set an input once. How would that work in case I was using different plugins?
 
You said I could set an input once. How would that work in case I was using different plugins?
As in, you set the input level once per plugin and then you never have to think about it again.

If you know your input headroom is 13dBu, then all you do is adjust by however many dB you need for a particular plugin. And if your preamp gain is at minimum, you don't need to worry about changing the gain knob to record vocals and then having to recalibrate again.
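As a sketch of that "measure once, derive every trim" idea - all plugin calibration values below are invented purely for illustration, and `trims_for_plugins` is a hypothetical helper name:

```python
def trims_for_plugins(interface_max_dbu, plugin_cal_dbu):
    """One measured interface headroom (gain at minimum) plus each
    plugin's published calibration level gives every input trim."""
    return {name: interface_max_dbu - cal
            for name, cal in plugin_cal_dbu.items()}

# Invented calibration levels, purely for illustration.
print(trims_for_plugins(13, {"SimA": 6, "SimB": 12, "SimC": 13}))
# -> {'SimA': 7, 'SimB': 1, 'SimC': 0}
```

Once the interface headroom is measured, changing plugins only changes the second number; the measurement never has to be repeated.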

Not really sure where you're trying to take this discussion, because using a guitar to set the level will never provide levels as accurate as this method. There is literally no benefit to doing it that way - it's bad advice that needs to be left in the past (when it comes to setting input levels for amp sims).
 
Yes. That's what I'm usually doing anyway. But as there are no manufacturer-certified numbers, I'm doing it by ear.
Perhaps this thread will be of interest to you - it has official values from most plugin manufacturers (including Native Instruments/Guitar Rig).


and a table here of common interfaces and many amp sims:



But yeah - the point is, doing it by ear has limited accuracy. If we're getting a response from an official rep, we may as well set the bar a bit higher and remove any ambiguity.
 
Perhaps this thread will be of interest to you

I've been reading it already. Right now it's not of much interest, as I hardly ever use amp sim plugins (apart from HXN to create patches for my Stomp). But even if I was, I'm still fooling with the input gain of my interface all the time and don't want to have to remember exact gain pot positions so the pre-ampsim gain plugins would work the way they should. Hence I'm using my ears - which usually works just fine.
 
I'm still fooling with the input gain of my interface all the time and don't want to have to remember exact gain pot positions so the pre-ampsim gain plugins would work the way they should. Hence I'm using my ears - which usually works just fine.
This is precisely why it's recommended to just set the gain to minimum - then you don't have to calibrate again. With your inaccurate method, you still have to re-calibrate with your guitar every time you adjust the gain pot.

But if you don't care about accuracy from the behaviour of the plugin, what point are you trying to prove here?

Anyone can adjust things by ear to something they like but that has no bearing on accuracy, which is what several people would like to know for Genome.
 
This is precisely why it's recommended to just set the gain to minimum - then you don't have to calibrate again

If I set my gain to minimum, the waveform display is incredibly tiny, which is something I hate dealing with. Yeah, Logic has that "waveform widening" thing, but it applies to all waveforms at once, so it's not of much use.
 