Groundhog Audio OnePedal Kickstarter Next Week

MixMinisterMike

Using AI stems analysis to replicate guitar tones in songs:

This sort of solution has been predicted for a couple of years. It also looks to use some post-analysis modeling to create the signal chain and adapt that to the guitar you're using (kinda like how BOSS has had per-guitar settings in the GT-1000).
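
Out of curiosity about what that "post-analysis modeling" might boil down to, here's a rough Python sketch of the most naive version I can think of: average-spectrum matching EQ, the same basic idea as the tone-match blocks in modelers. To be clear, this is purely my own guess for illustration, not how the pedal actually works; the file names and FFT size are made up, and it assumes both clips share a sample rate.

import numpy as np
import soundfile as sf
from scipy.signal import welch, firwin2, fftconvolve

def average_spectrum(path, nfft=2048):
    # Long-term average magnitude spectrum of a file, folded to mono
    audio, sr = sf.read(path)
    if audio.ndim > 1:
        audio = audio.mean(axis=1)
    freqs, psd = welch(audio, fs=sr, nperseg=nfft)
    return freqs, np.sqrt(psd), sr

# target = guitar stem pulled out of the record, source = your own DI/reamped take
# (hypothetical file names, purely for illustration)
freqs, mag_target, sr = average_spectrum("separated_guitar_stem.wav")
_, mag_source, _ = average_spectrum("my_di_reamped.wav")

# EQ correction curve: how much each band has to come up or down to match,
# clipped so a noisy band can't demand an absurd boost or cut
correction = np.clip(mag_target / (mag_source + 1e-12), 0.1, 10.0)

# Turn the curve into a linear-phase FIR "match EQ" and run the take through it
fir = firwin2(1025, freqs / (sr / 2), correction)
take, _ = sf.read("my_di_reamped.wav")
if take.ndim > 1:
    take = take.mean(axis=1)
matched = fftconvolve(take, fir, mode="same")
sf.write("matched_tone.wav", matched / np.max(np.abs(matched)), sr)

Even something this crude can get the EQ balance in the ballpark; the hard part is everything nonlinear (drive, compression, speaker breakup), which is presumably where the actual modeling work lives.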

Anecdotally, I've had several conversations with cover guitarists lately who have expressed to me they wish modeling could just give them the right tone, instead of them having to learn an entire toolbox to get there. Kinda flies in the face of "all presets suck." While almost all of us here want a complete toolbox, there's a large portion of guitarists who just don't want to spend that time with their gear and just want to focus on playing.
 
[We Are Doomed reaction GIF]
 
I'm curious how accurate it can get.

I'm using the top-tier version of Moises right now to make stems out of my band's demo so I can replace all the guitars/bass, and the AI stem splitting isn't 100% accurate/clean. Then there's the whole mastering process and how that changes guitar tones to consider. That said, it's a nifty idea, and I'd imagine the people who'll buy it aren't going to be hugely concerned about accuracy.
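
For anyone who wants to try the stem-splitting part without a Moises subscription, the open-source Demucs separator is a reasonable stand-in (to be clear, just an analogous tool, not what Moises runs under the hood). Its six-stem htdemucs_6s model is the one with a dedicated guitar stem. A minimal sketch, with made-up file and output paths:

import subprocess
from pathlib import Path

song = "band_demo.wav"      # hypothetical input file
out_dir = Path("separated")

# Same as running on the command line:
#   demucs -n htdemucs_6s --two-stems guitar -o separated band_demo.wav
subprocess.run(
    [
        "demucs",
        "-n", "htdemucs_6s",      # 6-source model: drums/bass/other/vocals/guitar/piano
        "--two-stems", "guitar",  # guitar vs. everything else
        "-o", str(out_dir),
        song,
    ],
    check=True,
)

# Demucs writes <out>/<model>/<track>/<stem>.wav
guitar_stem = out_dir / "htdemucs_6s" / Path(song).stem / "guitar.wav"
print(f"Isolated guitar written to {guitar_stem}")

Same caveat as with Moises, though: the isolated track will still have bleed and artifacts.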

The whole "I don't want to touch knobs/dial in a tone" is so foreign to me because that part is just as much fun to me as playing, sometimes more, but I do see it a lot more often. I have to assume this is something technology has brought on; growing up it was pretty much customary to take an interest in gear and tone searching so you can get what you wanted out of your playing/tones. The entire reason I learned how to record/mix is because it drove me absolutely crazy putting effort into tones just to have someone else do what they wanted in the mix process after. I can't even wrap my head around "Please provide me with my tone"
 
The whole "I don't want to touch knobs/dial in a tone" is so foreign to me because that part is just as much fun to me as playing, sometimes more, but I do see it a lot more often. I have to assume this is something technology has brought on; growing up it was pretty much customary to take an interest in gear and tone searching so you can get what you wanted out of your playing/tones. The entire reason I learned how to record/mix is because it drove me absolutely crazy putting effort into tones just to have someone else do what they wanted in the mix process after. I can't even wrap my head around "Please provide me with my tone"

That's the main reason this tech sucks. Mrs Doyle said it better than I ever could.

 
I get it, but isn't it sort of taking all the fun out of guitar playing? It's like buying a new car that has no steering wheel because it's self-driving... I mean, sure, if you hate driving (tone sculpting, in this case).

Pretty much. I hate "solutions" like this that try to automate away aspects of playing music.

I think many guitarists go from emulating their heroes, or trying to match the tones of records, to finding their own sound. This sort of product seems to take out a big part of that journey, which means that when the user moves on to something better they'll be totally lost on how to use it.

The tone I dial for myself is a very important part of musical expression.
 
I've had several conversations with cover guitarists lately who have expressed to me they wish modeling could just give them the right tone, instead of them having to learn an entire toolbox to get there.

This attitude is part of the death of live music. Cover bands used to put their own spin on the music and it was a heck of a lot more interesting. One of the downsides of modeling is it has allowed cover artists to chase album tones for every song in their setlist, and now you might as well just listen to the original recordings. Yuck.
 
I don’t know about the death of live music, but it’s certainly a fast track on the continuing homogenization of what music will sound like. Honestly a cover band is the perfect place for this. You’re already throwing out creativity, no reason to labor over tones.
 

Wedding bands especially. People want their favourite songs to sound like they do on the record.
 
This attitude is part of the death of live music. Cover bands used to put their own spin on the music and it was a heck of a lot more interesting. One of the downsides of modeling is it has allowed cover artists to chase album tones for every song in their setlist, and now you might as well just listen to the original recordings. Yuck.
Not to mention that when other artists record covers, they often put their own spin on them - sometimes even a drastic re-imagining, like Johnny Cash's take on Nine Inch Nails' "Hurt".

Now imagine if all those artists had just phoned it in with an AI tool to make it sound like the original record.

Even with cover bands, I'm not looking for them to be a jukebox; I want them to make their own interpretation of the songs. If it's well done, yes, it will sound pretty similar to the original, but not identical.
 