Boss GM-800 and GK-5

Sloppy playing is somehow OK on guitar. A guitar synth exposes these weaknesses in a player.
When I hear GM-800 P2M tracking errors from technically proficient players, I don't believe your statement is entirely true. In one of the GM-800 YouTube demos, one such technically proficient guitarist stated that the tracking errors are still there.

The point is, they don't have to be there with the DSP modeling technology currently available in software.

Supporting Roland's product-repackaging marketing strategy doesn't help get us a hardware guitar synth that can accurately model acoustic instruments and actually works for everyone.
 
But the underlying fact is that nothing is really improved beyond what was already out there.

I disagree. The sounds/samples in the GM-800 are far better than anything Roland/Boss has done before. Definitely not the same sounds as the GR-55 or older ones. Also, name me one Roland/Boss guitar synth where you can layer four parts? ;) So I think those are big improvements.
 
What's the target audience size for a box like this? Super shocking that Boss would repackage something.
The target audience is those of us who are attracted to bright, shiny objects and support the repackaging marketing strategy in the hope that it actually works this time.

That used to be me, BTW, but I now understand that if I keep supporting that marketing strategy, it will just keep happening. We will just keep getting repackaged '90s technology.
 
Nice edit, @Trollbait. He said I was talking out of my ass. I responded in kind.
That's why I removed my post; I didn't want to seem like I was attacking you.

But the point I made in that removed post stands.

When we use online personal attacks, it reflects our character, not the character of the person we are attacking.

I thought personal attacks were a hallmark of TOP, which is why I left TOP and VGuitarForums (although I still read the very informative technical material there at V).
 
When I hear GM-800 P2M tracking errors from technically proficient players, I don't believe your statement is entirely true. In one of the GM-800 YouTube demos, one such technically proficient guitarist stated that the tracking errors are still there.

The point is, they don't have to be there with the DSP modeling technology currently available in software.

Supporting Roland's product-repackaging marketing strategy doesn't help get us a hardware guitar synth that can accurately model acoustic instruments and actually works for everyone.
Proficient guitar playing doesn't always mean clean playing.

Roland had never created a ZenCore-based guitar synth prior to the GM-800. ZenCore tech has been around for years, but not for guitarists.
 
What's your solution then?
I've only recently learned that acoustic modeling technology is available in software.

I would have purchased the GM-800 if it had that technology. I already have a GR-50 that works well enough for its internal PCM sample snippets. But it does have one of Roland's better P2M converters for triggering external MIDI synthesizers.
 
I'm only confused why anyone would want these translated via MIDI or anything else for that matter???

Because they're part of the guitar's articulation vocabulary.
Not that they'd make much sense all the time, but it also doesn't have to be the full enchilada. On a smaller scale, slurs and double bends are pretty much the same thing, and yes, if it were possible, I definitely would want these to be ported into the land of MIDI. I spent a great deal of time practising articulation on the guitar and don't want that to be lost. Which, FWIW (and as said before), is why I absolutely dig Roland's work on the VG and SY series: these actually take guitar articulations into account.

And again, as said before: this is the fundamental difference between guitars and keyboards when used as MIDI controllers.
Pretty much anything you do on a keyboard (no, we don't need esoteric examples of people playing prepared pianos with spoons and whatnot) translates into MIDI 1:1. Heck, even if you're a classically trained pianist, you don't need to adjust much (if any) of your core playing technique.
On the guitar, however, not only is a ton of information lost, it even causes unwanted side effects, so as a result you have to adjust your playing technique quite a bit (or just live with whatever comes out). In addition, the converter units are better off if they allow for some pre-filtering before the signal hits your DAW or synth. There are "data thinning" options for a reason. I still remember the memory overloads on the Atari ST: without careful playing and data thinning, you could simply run out of note/data memory back in the day. None of that would ever happen with a master keyboard.
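
For what it's worth, here's a tiny Python sketch of both points: how little a keyboard key actually has to transmit (three bytes per note-on), and a naive version of the kind of "data thinning" those converter units offer. It's purely illustrative; the function names and the threshold are made up, and it has nothing to do with how any Roland/Boss box actually implements it.

```python
# Illustrative sketch only: what a keyboard key press looks like on the wire,
# plus one naive form of "data thinning" for a dense controller stream.
# Names and thresholds are invented; this mirrors no actual product.

def note_on(channel, note, velocity):
    """A key press is just three bytes: status, note number, velocity."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def thin_cc_stream(events, min_delta=3):
    """Drop controller messages whose value barely changed since the last
    message kept for that (channel, controller) pair."""
    last = {}
    kept = []
    for channel, controller, value in events:
        key = (channel, controller)
        if key not in last or abs(value - last[key]) >= min_delta:
            last[key] = value
            kept.append((channel, controller, value))
    return kept

if __name__ == "__main__":
    print(note_on(0, 40, 100).hex())             # low E (E2) -> '902864'
    sweep = [(0, 11, v) for v in range(128)]     # a full CC11 expression sweep
    print(len(sweep), "->", len(thin_cc_stream(sweep)))  # 128 -> 43 messages
```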
 
What's the solution for the guitarist getting into guitar synth today?

Obviously the device discussed in this thread (if you really want "true" synthesis from the ground up). Or any other similar device, for that matter.
But if I were to build a synth-like guitar rig, I'd likely go for an SY-1000 (maybe even combined with a VG, but those are really showing their age IMO) or get even more adventurous by tweaking the split-up hex pickup signal myself (a nightmare to keep track of, but these days there might be better tools to organize the six audio streams and possibly tweak them simultaneously if needed).
 
Also, name me one Roland/Boss guitar synth where you can layer four parts?
The 1990s Roland GR-50 could layer four parts. These were known as partials, and they could consist of PCM sample snippets and/or a subtractive synthesizer algorithm.
Roland had never created a ZenCore-based guitar synth prior to the GM-800.
Are those sounds glitch-free, like on the low E string? Please tell me you've never had a tracking error with the GM-800.
Fishman doesn't really cut it either, as already outlined clearly in this thread.
I've read that many people still prefer the Fishman TriplePlay over the GM-800, and I believe the latency measurements favor the TriplePlay. And remember that the TriplePlay can trigger any external synthesizer you want, including the actual analog synthesizers that the GM-800 only offers as sample snippets.

It's good enough for John McLaughlin, but even he knows it's not perfect.
 
The 1990s Roland GR-50 could layer four parts. These were known as partials, and they could consist of PCM sample snippets and/or a subtractive synthesizer algorithm.

Are those sounds glitch-free, like on the low E string? Please tell me you've never had a tracking error with the GM-800.

I've read that many people still prefer the Fishman TriplePlay over the GM-800, and I believe the latency measurements favor the TriplePlay. And remember that the TriplePlay can trigger any external synthesizer you want, including the actual analog synthesizers that the GM-800 only offers as sample snippets.

It's good enough for John McLaughlin, but even he knows it's not perfect.

The GM-800 isn't perfect, and it's a little slow on the low E and A, but as far as bloops, bleeps, and glitches like the GR-55 and other past ones? Nope! Once it's set up right, I don't get that at all.

OK, so maybe the GR-50 could do partials, but I bet it couldn't layer four parts anywhere it wanted to on the fretboard like the GM-800 can? :) Technically five parts if I wanted to add the drum partial... And yes, I know the TriplePlay can do that, but I'm talking Boss/Roland.
 
Because they're part of the guitar's articulation vocabulary.

Serious questions. If pick scrapes, dive bombs, pinch harmonics, and the like were able to be communicated via MIDI (or any other protocol, for that matter) to a synth, what would they be used for? What would they translate into? Would they open or close the filter? Increase the resonance? Create a controlled pitch slide? Generate some modulation of vibrato, tremolo, additive harmonic overtone content, or something else? Change the envelopes and how they react? Generate new notes? How would you assign such a source of control to a modulation destination within the synth, and what range of effect would it be? What amount of bit-rate resolution would be needed to make it all work effectively without clogging up the data stream or getting a zipper-like reaction to the desired end expression? Lastly, what types of synth sounds would benefit from those guitar-centric playing styles?
 
I'm trying to figure out why, when MIDI was first developed, they called it MIDI (Musical Instrument Digital Interface) and not just KDI (Keyboard Digital Interface).
Was there some foresight back then that this tech would also apply to other types of instruments?

You can tinker about that all you want, but my "musings", if you will, aren't even about that. It's all about factual things. Pitch-to-MIDI, or rather "string-to-MIDI" (which adds quite a few further issues), is inherently vastly inferior to the highly controlled environment of a key sending simple on/off messages plus velocity data. Because that's all that's happening.

Apart from that, it doesn't even need to be mentioned that MIDI serves a lot of purposes other than just transferring keyboard note data.
But that still doesn't make it any more suitable for Guitar-to-MIDI adventures.

In fact, it's not even the fault of MIDI that it doesn't work all that well with guitars; it's rather the guitar itself, which simply wasn't designed to throw out electric control messages (whereas that is exactly what keyboards are built around, even the oldest ones). If it were, it'd possibly look like a SynthAxe, or at least feature sensing frets, so you wouldn't have to deal with the burden of having to extract pitch data out of what is an incredibly complex source signal.
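
To make the "incredibly complex source signal" point concrete, here's a bare-bones monophonic pitch estimator in Python (plain autocorrelation; real trackers, hex-pickup or otherwise, are far more sophisticated, and everything below is only a sketch). The key takeaway: a low E at roughly 82 Hz has a period of about 12 ms, so a converter needs at least that much waveform before it can even guess which note was played, whereas a key switch knows instantly.

```python
# Minimal, illustrative monophonic pitch estimator (plain autocorrelation).
# Real pitch-to-MIDI trackers are far more elaborate; this only shows why the
# job is harder than reading a key contact: a pitch doesn't exist until you
# have collected enough waveform to measure it.
import numpy as np

def estimate_pitch(frame, sample_rate, fmin=70.0, fmax=1000.0):
    """Return an estimated fundamental in Hz, or None if nothing convincing."""
    frame = np.asarray(frame, dtype=float)
    frame = frame - frame.mean()
    energy = float(np.dot(frame, frame)) / len(frame)
    if energy < 1e-6:
        return None                                  # too quiet to analyze
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo = int(sample_rate / fmax)                     # shortest lag considered
    hi = int(sample_rate / fmin)                     # longest lag (low E ~ 12 ms)
    if hi >= len(frame):
        return None                                  # frame too short for low notes
    lags = np.arange(lo, hi)
    norm = corr[lo:hi] / (len(frame) - lags)         # compensate shrinking overlap
    best = int(np.argmax(norm))
    if norm[best] < 0.5 * energy:                    # weak periodicity: mute or noise
        return None
    return sample_rate / lags[best]

if __name__ == "__main__":
    sr = 44100
    t = np.arange(int(0.025 * sr)) / sr              # 25 ms: only ~2 cycles of low E
    print(estimate_pitch(np.sin(2 * np.pi * 82.41 * t), sr))  # ~82 Hz
```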
 
Serious questions. If pick scrapes, dive bombs, pinch harmonics, and the like were able to be communicated via MIDI (or any other protocol, for that matter) to a synth, what would they be used for? What would they translate into? Would they open or close the filter? Increase the resonance? Create a controlled pitch slide? Generate some modulation of vibrato, tremolo, additive harmonic overtone content, or something else? Change the envelopes and how they react? Generate new notes? How would you assign such a source of control to a modulation destination within the synth, and what range of effect would it be?

Simple answer (still absolutely serious): Anything you like could basically be possible. As is, nothing is possible.

What amount of bit-rate resolution would be needed to make it all work effectively without clogging up the data stream or getting a zipper-like reaction to the desired end expression?

Data resolution shouldn't be an issue. You can go all mad with synths and record tons of controller information on the same track, all transmitted at once to whatever will create the final sound.
It's all (!) about the actual conversion process. That very process would need to be able to identify the difference between smooth glides and heavy slurs. It would need to identify dead notes as such, and not as something with an assigned pitch (which is what all current systems do). Etc.
In other words: anything not sounding exactly like a clear note when you play your guitar would have to be translated into controller data (because there's no "advanced note parameter" subset in the MIDI protocol).
As long as these things can't be done (and IMO we're far away from anything like that happening, especially given that all this is a super-niche thing anyway), guitar-to-MIDI will stay as limited as it is.

In addition, the receiving side would likely have to be adjusted to make use of that additional data too, but on many synths that could be done already: you can basically use any controller information to control anything through direct CC routing, modulation matrices and whatnot.
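
As a purely hypothetical sketch of that idea (no existing converter, GM-800, TriplePlay or otherwise, works this way; the event fields, the periodicity threshold, and the CC number are all invented for illustration): a per-string attack either carries a stable pitch and becomes an ordinary note-on, or it doesn't and gets sent as controller data the receiving synth can route through its mod matrix, or simply ignore.

```python
# Hypothetical converter-side logic only. The StringEvent fields, the 0.6
# periodicity threshold and CC 20 are invented; nothing here describes an
# actual Roland/Boss/Fishman implementation.
from dataclasses import dataclass
from typing import Optional

DEAD_NOTE_CC = 20                      # arbitrary unassigned controller for the example

@dataclass
class StringEvent:
    string: int                        # 1..6
    midi_note: Optional[int]           # None if no stable pitch was detected
    strength: float                    # 0.0..1.0 pick-attack strength
    periodicity: float                 # 0.0..1.0, how "pitched" the attack was

def translate(event, channel=0):
    """Pitched attacks become note-ons; muted/dead attacks become a CC message."""
    velocity = max(1, min(127, int(event.strength * 127)))
    if event.midi_note is not None and event.periodicity > 0.6:
        return bytes([0x90 | channel, event.midi_note, velocity])
    # No reliable pitch: report the attack as controller data instead of a
    # wrong note, so the receiver can map it (to an envelope, a filter blip...)
    # or just throw it away.
    return bytes([0xB0 | channel, DEAD_NOTE_CC, velocity])

if __name__ == "__main__":
    print(translate(StringEvent(6, 40, 0.8, 0.9)).hex())   # clean low E -> note-on
    print(translate(StringEvent(6, None, 0.7, 0.1)).hex()) # muted scratch -> CC 20
```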

Lastly, what types of synth sounds would benefit from those guitar-centric playing styles?

Anything you want.
In fact, even if you didn't explicitly put any of that additional controller data to dedicated use, everything would still benefit. Because if the guitar signal could be dissected like that (which, as said, I doubt will happen any day soon), the additional information could also be filtered out a lot more easily, resulting in a cleaner signal arriving at your synth's MIDI in.
As an example: you're playing some funky rhythm stuff, including lots of dead notes, exactly the kind of thing that Guitar-to-MIDI translates pretty badly. Now, if our super-hypothetical new Guitar-to-MIDI converter could separate the clean note data from the mutes and turn the latter, say, into CC data, you could just ignore that on the receiving end and use only the clean chord data for whatever synths you like (sketched below). That could, for instance, be a four-part horn section; typical drop-2 chords, as often used in funk guitar, translate to horns extremely well, and all of a sudden your funky guitar would be doubled by a horn section (which can sound quite great, I've been doing that kind of stuff already...) in realtime, simply because all those pesky dead notes would be ignored by the horn section (which is the trick that makes this work).
On other patches you may want to explicitly make use of that additional data. Percussive dead notes could surely work great on any synth patch sounding a bit, say, clavinet-ish (on a realistic-sounding clavinet patch they likely wouldn't, but who knows...). But for that to work properly, the converter would have to translate them into additional CC data too, which you could then use to control, say, an envelope, just so the mutes you play have their sort of equivalent on the synth used.
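
And here is the receiving end of that funk-guitar-doubled-by-horns idea, continuing the same hypothetical converter sketch from above: the horn patch only ever sees real note-ons/offs, while the "dead note" controller messages are silently dropped (or could be rerouted elsewhere). Again, the CC number is the invented one from the earlier sketch, not anything a real product defines.

```python
# Receiving-end counterpart to the hypothetical converter above: let only the
# clean chord tones through to the horn patch and discard the "dead note" CCs.
DEAD_NOTE_CC = 20                      # same invented controller as in the sketch above

def horn_section_filter(midi_messages):
    """Yield only the messages the horn patch should react to."""
    for msg in midi_messages:
        status = msg[0] & 0xF0
        if status == 0xB0 and msg[1] == DEAD_NOTE_CC:
            continue                   # percussive mutes: the horns ignore these
        yield msg                      # notes (and anything else) pass through

if __name__ == "__main__":
    stream = [bytes([0x90, 50, 100]),  # D3, part of a drop-2 voicing
              bytes([0xB0, 20, 88]),   # a muted scratch, flagged by the converter
              bytes([0x80, 50, 0])]    # note-off
    print([m.hex() for m in horn_section_filter(stream)])
```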
 
Simple answer (still absolutely serious): Anything you like could basically be possible. As is, nothing is possible.



Data resolution shouldn't be an issue. You can go all mad with synths and record tons of controller information on the same track, all transmitted at once to whatever will create the final sound.
It's all (!) about the actual conversion process. That very process would need to be able to identify the difference between smooth glides and heavy slurs. It would need to identify dead notes as such, and not as something with an assigned pitch (which is what all current systems do). Etc.
In other words: anything not sounding exactly like a clear note when you play your guitar would have to be translated into controller data (because there's no "advanced note parameter" subset in the MIDI protocol).
As long as these things can't be done (and IMO we're far away from anything like that happening, especially given that all this is a super-niche thing anyway), guitar-to-MIDI will stay as limited as it is.

In addition, the receiving side would likely have to be adjusted to make use of that additional data too, but on many synths that could be done already: you can basically use any controller information to control anything through direct CC routing, modulation matrices and whatnot.



Anything you want.
In fact, even if you didn't explicitly put any of that additional controller data to dedicated use, everything would still benefit. Because if the guitar signal could be dissected like that (which, as said, I doubt will happen any day soon), the additional information could also be filtered out a lot more easily, resulting in a cleaner signal arriving at your synth's MIDI in.
As an example: you're playing some funky rhythm stuff, including lots of dead notes, exactly the kind of thing that Guitar-to-MIDI translates pretty badly. Now, if our super-hypothetical new Guitar-to-MIDI converter could separate the clean note data from the mutes and turn the latter, say, into CC data, you could just ignore that on the receiving end and use only the clean chord data for whatever synths you like. That could, for instance, be a four-part horn section; typical drop-2 chords, as often used in funk guitar, translate to horns extremely well, and all of a sudden your funky guitar would be doubled by a horn section (which can sound quite great, I've been doing that kind of stuff already...) in realtime, simply because all those pesky dead notes would be ignored by the horn section (which is the trick that makes this work).
On other patches you may want to explicitly make use of that additional data. Percussive dead notes could surely work great on any synth patch sounding a bit, say, clavinet-ish (on a realistic-sounding clavinet patch they likely wouldn't, but who knows...). But for that to work properly, the converter would have to translate them into additional CC data too, which you could then use to control, say, an envelope, just so the mutes you play have their sort of equivalent on the synth used.

May I ask, have you ever created your own new patches from scratch in a complex synth like ZenCore, or even a simple one like a Minimoog? Much, if not most, of what you are hoping for isn't available within these synths, and that has nothing to do with MIDI.

Look at the knobs, switches, and controls on this Minimoog and consider where the desired guitar-centric articulations could be applied. Realistically, there isn't much to work with at the end of the line that isn't already available without all that guitar stuff.

[Image: Minimoog front panel]

ZenCore/Zenology isn't much more accessible in terms of usable and/or useful destinations to control in the way desired. Most of the basic complexity in ZenCore (and in other complex synths) comes from the basic building blocks that generate the core sound: the waveforms of the oscillators or samples and their cross-modulation, the filter types, the envelope settings, and the effects. All of these available parameters for end modulation are already driven by note number, velocity, pitch bend, mod wheel, aftertouch, and/or expression pedal. Obviously, a guitar doesn't have a mod wheel or aftertouch. But the GM-800 does have control-switch and expression-pedal inputs that can be assigned to these as desired.
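
To illustrate what "driven by note number, velocity, pitch bend, mod wheel, aftertouch, and/or expression pedal" means in practice, here's a generic modulation-matrix sketch. It is not ZenCore's (or any real engine's) architecture; the source/destination names and depths are made up, but most synths mentioned in this thread implement some variant of this routing, which is also where an assignable GM-800 expression pedal or control switch would land.

```python
# Generic modulation-matrix sketch. Not ZenCore and not the GM-800: source and
# destination names and depths are invented to illustrate the routing principle.

MOD_MATRIX = [
    # (source,         destination,     depth)  depths are -1.0..+1.0
    ("velocity",       "amp_level",      0.8),
    ("expression_cc",  "filter_cutoff",  0.6),
    ("pitch_bend",     "osc_pitch",      1.0),
]

def apply_mod_matrix(sources, base_params):
    """Offset each destination parameter by its modulation sources."""
    params = dict(base_params)
    for source, destination, depth in MOD_MATRIX:
        value = sources.get(source, 0.0)     # normalized: 0..1, or -1..1 for pitch bend
        params[destination] = params.get(destination, 0.0) + depth * value
    return params

if __name__ == "__main__":
    base = {"amp_level": 0.2, "filter_cutoff": 0.3, "osc_pitch": 0.0}
    live = {"velocity": 100 / 127, "expression_cc": 0.5, "pitch_bend": 0.0}
    print(apply_mod_matrix(live, base))
    # {'amp_level': 0.829..., 'filter_cutoff': 0.6, 'osc_pitch': 0.0}
```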

Point is, what can be done within a synth already exists with current tools. It is conceivable that new synths or upgrades could add guitar-centric controls, but right now there is no there... there. Again, that doesn't have anything to do with MIDI.

Lastly, the market for this guitar synth stuff is incredibly tiny relative to the guitar and synth markets overall. It's admirable what Roland has brought to the table and the resources they've already made available to us. Make the most of what it is rather than object to what it isn't.

Cheers, TJ
 