QC vs Helix

Well, you would think measurements like the ones Fractal uses (impedance, voltages) would have more impact on feel than EQ.
For instance, a Bogner has that chew in the mids because of a low voltage in the preamp.
So technically, without seeing what data TINA is collecting, it is hard to know.
How are they getting the power amp data? Fractal is collecting things like transformer data, e.g. the number of winds.
I don't know how TINA would do that without having a probe in the amp.
I don't understand why you're talking about EQ? The paper that Laxu linked is pretty clear about the methodology. Because I'm roughly half an idiot, I can't speak to specifics. But.... it seems to me:

- They send sounds (test signals) into the actual amplifier and record what comes out at different knob settings (like gain, bass, treble).
- They use a type of AI (a neural network) to learn the patterns between the input sounds, the amp’s knobs, and the output sounds.
- This process captures the amp’s full behavior, including complex interactions between the input signal and the controls.
- Once trained, this AI can take any input sound, simulate how the amp would change it, and let you tweak the virtual knobs just like the real amp.
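
If it helps, here's a toy sketch of that data-collection idea in Python. Everything here (the fake amp, the knob grid) is made up for illustration, not anything from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def fake_amp(x, gain, bass):
    # Stand-in for the real amplifier: a nonlinearity whose shape
    # depends on the knob settings (purely illustrative).
    return np.tanh(gain * 10.0 * x) * (0.5 + 0.5 * bass)

dataset = []
test_signal = rng.uniform(-1.0, 1.0, size=4096)  # one "test sweep"
for gain in (0.2, 0.5, 0.8):        # knob positions a robot could step through
    for bass in (0.0, 0.5, 1.0):
        target = fake_amp(test_signal, gain, bass)
        # One training example: (input audio, knob vector, output audio)
        dataset.append((test_signal, np.array([gain, bass]), target))

print(len(dataset))  # 9 knob combinations
```

The network then learns the mapping from (input audio, knob vector) to output audio across all of those examples at once.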

So in what world would that not account for the feel of the original amp???

This little side-quest discussion has nothing to do with the original EQ point being parsed from what bishopgame "guessed" in his original comments, which has somehow become a kind of gospel, with no evidence whatsoever, I might add.
 
Meh, any sound I could get out of any of the NDSP plugins I’ve tried was there in Helix, with a low cut/high boost ahead of the amp, the master volume set a little higher than stock, and the output hitting around -6 in the DAW instead of the usual -10/-12 that comes out of HX Native stock presets.
 
Meh, any sound I could get out of any of the NDSP plugins I’ve tried was there in Helix, with a low cut/high boost ahead of the amp, the master volume set a little higher than stock, and the output hitting around -6 in the DAW instead of the usual -10/-12 that comes out of HX Native stock presets.
Interesting. So another person not experiencing the same thing as @Lysander!!
 
That's an oversimplification.

https://arxiv.org/html/2403.08559v1 details their current approach.

As far as I understand, TINA captures a sensible data set, which is then fed to a neural network model that churns out the final model. They figured out a novel way to get enough data for different knob positions, so they don't need a massive data set.

It's not individual captures being interpolated or anything like that. It's just based on a data set from tons of test signals.
Thanks for linking the paper.

"To incorporate the conditioning variables to the model, we normalize them to the range [0,1], and concatenate them as additional input channels to the LSTM network. Based on preliminary experiments, we found that a single layer LSTM with 32 cells (denoted LSTM-32) provides a good balance between perceptual quality and real-time cost."

Their approach is a little less sophisticated than I thought. I believe their neural net has 32 inputs (and only one layer) feeding their underlying guitar amp model, and they use ML to get the positions of those controls that most closely match the comparison signal. I expect that a large portion of those variables are EQ bands. They are just using ML to find the exact positions of gain and EQ settings.
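
For what it's worth, the conditioning trick in that quoted passage can be sketched in a few lines: normalize each knob to [0, 1] and repeat it as an extra input channel at every timestep. This is my own toy illustration in numpy, not their code:

```python
import numpy as np

def make_lstm_input(audio, knobs, knob_ranges):
    """Build the (timesteps, channels) input the quote describes:
    the audio sample plus each knob value, normalized to [0, 1]
    and repeated at every timestep as an extra channel."""
    norm = np.array([(v - lo) / (hi - lo) for v, (lo, hi) in zip(knobs, knob_ranges)])
    cond = np.tile(norm, (audio.shape[0], 1))   # same knob values each timestep
    return np.concatenate([audio[:, None], cond], axis=1)

x = make_lstm_input(np.zeros(8), knobs=[5.0, 2.0], knob_ranges=[(0.0, 10.0), (0.0, 10.0)])
print(x.shape)  # (8, 3): one audio channel plus two knob channels
```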

The TINA mechanical controller is a way for them to develop an integrated picture of an amp, like Liquid Profiles.

In the end, each control setting in their amp sim maps to the underlying variables (found with ML) that make their amp sim sound like the real thing. As they said, 32 inputs is a good balance of perception and performance. However, their underlying model may still have limitations in what it can do. For instance, a real power amp might induce some EQ changes among other things (some distortion). Because their model does not have a power amp sim, they can only approximate those changes with EQ and gain. Their ability to express what an actual amp is doing is limited by those 32 channels.
 
I don't understand why you're talking about EQ? The paper that Laxu linked is pretty clear about the methodology. Because I'm roughly half an idiot, I can't speak to specifics. But.... it seems to me:

- They send sounds (test signals) into the actual amplifier and record what comes out at different knob settings (like gain, bass, treble).
- They use a type of AI (a neural network) to learn the patterns between the input sounds, the amp’s knobs, and the output sounds.
- This process captures the amp’s full behavior, including complex interactions between the input signal and the controls.
- Once trained, this AI can take any input sound, simulate how the amp would change it, and let you tweak the virtual knobs just like the real amp.

So in what world would that not account for the feel of the original amp???

This little side-quest discussion has nothing to do with the original EQ point being parsed from what bishopgame "guessed" in his original comments, which has somehow become a kind of gospel, with no evidence whatsoever, I might add.
What I am saying is that they can match a tonestack and the influence it has on the frequencies.
But it's the non-linear things that will be tough to catch with that method.

Cliff said it in one of John's Helix Recto matches: if you match the EQ curve 100% to the amp's controls, 90% of guitar players will say it is the same.

But it's the non-linear behaviours that are harder to reproduce.

I saw that they mention they can get the full behaviour, but how does that technique differ from, or compromise on, what Ben from Line 6 or Cliff are doing by measuring individual components and using schematics?
 
Never ending stooooooryyyyyyyyyyyy
Aaa a aaaa aaaa a aaaaa a aaaaahhhh
 
What I am saying is that they can match a tonestack and the influence it has on the frequencies.
But it's the non-linear things that will be tough to catch with that method.

Cliff said it in one of John's Helix Recto matches: if you match the EQ curve 100% to the amp's controls, 90% of guitar players will say it is the same.

But it's the non-linear behaviours that are harder to reproduce.

I saw that they mention they can get the full behaviour, but how does that technique differ from, or compromise on, what Ben from Line 6 or Cliff are doing by measuring individual components and using schematics?
When AI takes over, at least those of us who studied AI will know why we got pwned.
 
What I am saying is that they can match a tonestack and the influence it has on the frequencies.
But it's the non-linear things that will be tough to catch with that method.

Cliff said it in one of John's Helix Recto matches: if you match the EQ curve 100% to the amp's controls, 90% of guitar players will say it is the same.

But it's the non-linear behaviours that are harder to reproduce.

I saw that they mention they can get the full behaviour, but how does that technique differ from, or compromise on, what Ben from Line 6 or Cliff are doing by measuring individual components and using schematics?
I just don't think you understand what machine learning and neural networks are. They aren't simply doing frequency matching. It accounts for all of the non-linear behaviours that you are referring to.
 
I just don't think you understand what machine learning and neural networks are. They aren't simply doing frequency matching. It accounts for all of the non-linear behaviours that you are referring to.
A single-layer neural net is not going to model non-linear relationships very well. I skimmed their paper and it seems they are using a single layer of 32 perceptrons.

They are using TINA to brute-force a solution. They are taking snapshots at many different combinations of settings and using ML to find the mappings to their underlying model.
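
If that reading is right (and I'm not sure it is), the fitting step would look something like this: try candidate settings of the underlying model and keep whichever best matches the captured snapshot. A deliberately tiny Python sketch, with the model and grid entirely invented:

```python
import numpy as np

def underlying_model(x, params):
    # Hypothetical underlying amp model with two knobs to fit.
    gain, level = params
    return level * np.tanh(gain * x)

def fit_snapshot(x, target, candidates):
    # Brute-force the parameter grid; keep whatever matches the capture best.
    return min(candidates,
               key=lambda p: np.mean((underlying_model(x, p) - target) ** 2))

x = np.linspace(-1, 1, 256)
target = 0.8 * np.tanh(3.0 * x)                      # "snapshot" of the real amp
grid = [(g, l) for g in (1.0, 3.0, 5.0) for l in (0.4, 0.8)]
print(fit_snapshot(x, target, grid))  # -> (3.0, 0.8)
```

The real system would be searching far more variables against far more snapshots, but the shape of the problem is the same.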
 
Are the non-linear behaviours what things like NAM and Neural Networks in general do particularly well?

Isn’t black box stuff generally used mostly for dynamic non-linear parts of circuits? I’m sure I read that UAD model their plugins like that: isolate the linear and non-linear parts of the circuit and tackle them accordingly.
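
That's roughly the Wiener/Hammerstein idea: keep the measured linear part as a filter and wrap a memoryless waveshaper around it. A toy Hammerstein chain in numpy (my sketch, nothing to do with UAD's actual code):

```python
import numpy as np

def toy_hammerstein(x, fir, drive):
    # Hammerstein decomposition, drastically simplified:
    # a memoryless non-linear stage followed by a measured linear filter.
    shaped = np.tanh(drive * x)                 # non-linear part (static waveshaper)
    return np.convolve(shaped, fir)[: len(x)]  # linear part (FIR filter)

x = np.linspace(-1.0, 1.0, 8)
y = toy_hammerstein(x, fir=np.array([0.6, 0.3, 0.1]), drive=4.0)
print(y.shape)  # (8,)
```

The appeal is that each half can be characterized separately: the filter from a frequency-response measurement, the waveshaper from a level sweep.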
 
I just don't think you understand what machine learning and neural networks are. They aren't simply doing frequency matching. It accounts for all of the non-linear behaviours that you are referring to.
Then that’s a fair point, I was not aware of that.
In their videos, it's just TINA turning knobs and offering that data,
which technically a human could do, albeit slowly.
But if you are saying the AI can also account for all the non-linear behaviours, like voltage sag and impedance matching, that’s great.
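
Sag is a good example of what "non-linear with memory" means: the available headroom droops as recent current draw goes up. A toy Python version of that idea, with made-up numbers that belong to no vendor's model:

```python
import numpy as np

def with_sag(x, sag_amount=0.5, recovery=0.99):
    # Crude supply-sag model: hard-clip against a "supply voltage"
    # that droops toward a lower level as recent output draw rises.
    supply = 1.0
    out = np.empty_like(x)
    for i, s in enumerate(x):
        out[i] = np.clip(s, -supply, supply)      # clip against current headroom
        draw = abs(out[i])
        supply = recovery * supply + (1 - recovery) * (1.0 - sag_amount * draw)
    return out

y = with_sag(np.ones(2000))   # sustained loud input
print(float(y[0]), float(y[-1]))  # later samples clip lower as the supply sags
```

The point is that the output at time t depends on what the signal was doing earlier, which is exactly what a stateless EQ match can't capture.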
 
Are the non-linear behaviours what things like NAM and Neural Networks in general do particularly well?

Isn’t black box stuff generally used mostly for dynamic non-linear parts of circuits? I’m sure I read that UAD model their plugins like that: isolate the linear and non-linear parts of the circuit and tackle them accordingly.
That two-pronged approach, a combination of both methodologies, would appear to have the best ability to capture a difficult or complex circuit design, you would think.
 
Dude, he's literally stating QC models are way too bassy :LOL: The shelf EQ thing is the exact same thing Igor mentioned as well.
He said:
any sound I could get any of NDSP plugins I’ve tried was there in Helix with a low cut/high boost ahead of the amp

Thus, to make Helix match NDSP, he needs to cut lows on Helix, not on QC.

Thus: Helix is bassier than NDSP, in his experience. Assuming he's not just being a total spoonface.

Also, he didn't say anything about shelves. You're obsessed with IKEA bro.
 