Line 6 Helix Stadium Talk

I swear a few months ago there was some discussion about it all. Maybe it was something that'll be in a future firmware, but I can't remember.
Perhaps DI mentioned it.
Either way, I'm fairly sure it's controllable with or without socks or crocs.
 
So, is there no viable use case for that XY feature in a performance scenario where the device is on the floor and both your hands are on your guitar, and you don’t have a toe to spare? Even if you had an iPad and it had the capability of mobile control - if you have both hands on your guitar it’s still not viable? Is that something you can automate and use in showcase?
You needn't always have both hands on the guitar. Consider as one scenario: you sound a chord, your left hand continues to sustain that chord, and your right hand reaches out to control parameters on one or more effects. (I used to use this a lot with resonance and cutoff on filters on synth voices. You could also use it to control mix, feedback, rate on any time-based effects.)

The fact that the Stadium might be on the floor is a legitimate concern, but again, the Stadium doesn't have to be on the floor. That's up to the user according to his/her requirements.
 
My right hand is never chillin’ like that 😈


So, is there no viable use case for that XY feature in a performance scenario where the device is on the floor and both your hands are on your guitar, and you don’t have a toe to spare? Even if you had an iPad and it had the capability of mobile control - if you have both hands on your guitar it’s still not viable? Is that something you can automate and use in showcase?

Put the iPad on a stand at crotch height...or maybe face height and use your nose? Just pick one and stick to it!
 
Oh, I agree completely. I don't play barefoot in my own home, let alone some random dive, and I don't like the idea of a touch screen on a floor device. The touch interface needs to be remote on an iDevice or Android thingy to make much sense.
Are you saying my toe cheese sensor idea has a limited market? Damn it.
 
You can map a predetermined trajectory on the XY plane to an expression pedal.

I don't own the unit, but it should be something like: you draw the path on the XY plane, and the expression pedal will follow it.
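If it works that way, the idea behind it can be sketched in a few lines: the pedal position (normalized 0.0 to 1.0) picks a point a proportional distance along the drawn path's total arc length. This is a hypothetical illustration of the concept, not the Helix Stadium's actual implementation; the function name and path format are made up.

```python
import math

def point_on_path(path, pedal):
    """Return the (x, y) point at fraction `pedal` (0..1) of the path's arc length."""
    # Cumulative arc length at each vertex of the drawn path.
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    target = pedal * dists[-1]
    # Find the segment containing the target distance and interpolate within it.
    for i in range(1, len(dists)):
        if target <= dists[i]:
            seg = dists[i] - dists[i - 1]
            t = 0.0 if seg == 0 else (target - dists[i - 1]) / seg
            (x0, y0), (x1, y1) = path[i - 1], path[i]
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    return path[-1]

# An L-shaped path: right along the bottom, then up the right side.
path = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
print(point_on_path(path, 0.5))  # halfway through the travel -> (1.0, 0.0)
```

Heel-down would sit at the start of the drawn path, toe-down at the end, and everything in between sweeps smoothly along it.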

If that's possible, figuring out how to do it is not at all intuitive.
 
So, is there no viable use case for that XY feature in a performance scenario where the device is on the floor and both your hands are on your guitar, and you don’t have a toe to spare? Even if you had an iPad and it had the capability of mobile control - if you have both hands on your guitar it’s still not viable? Is that something you can automate and use in showcase?
Expression pedals can be mapped to control the XY parameters, one or both axes at once. You could use multiple expression pedals for per-axis control, and there's at least one dual-axis controller on the market (it looks like an elephant's foot; I don't remember the brand).

The XY controller screen could also be used for, e.g., recording, where you reamp your sound and use the XY screen to manipulate it.
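The per-axis pedal idea amounts to this: each pedal sends a MIDI CC value (0-127), and each CC drives one normalized axis independently. A minimal sketch, assuming CC 11 and CC 12 for the two pedals (the CC numbers and function name are illustrative assumptions, not Stadium-specific):

```python
CC_X, CC_Y = 11, 12  # assumed CC assignments for the X and Y pedals

def cc_to_xy(state, cc, value):
    """Update a normalized (x, y) state dict from an incoming CC message."""
    norm = value / 127.0  # MIDI CC range 0-127 -> 0.0-1.0
    if cc == CC_X:
        state["x"] = norm
    elif cc == CC_Y:
        state["y"] = norm
    return state

xy = {"x": 0.0, "y": 0.0}
cc_to_xy(xy, CC_X, 127)  # X pedal fully toe-down -> x = 1.0
cc_to_xy(xy, CC_Y, 64)   # Y pedal at roughly half travel -> y ~ 0.504
print(xy)
```

With two pedals you get full two-dimensional control hands-free, which is the workaround for the "both hands on the guitar" scenario raised above.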
 