From what I’ve seen, it’s currently not possible to manually program or assign specific MIDI messages directly inside Genome, so MIDI Learn may be the only option available within the plugin itself.
If you’re using Gig Performer, though, automations might be a better and more flexible solution. You can program automations inside each Genome preset and then map them to widgets in your rackspace. This lets you store different sounds and parameter combinations per rackspace variation, effectively creating a kind of “Scenes” workflow, which Genome doesn’t natively support yet.
If you really need to use MIDI with Genome (for example, to control more than the 10 parameters allowed by automations), I usually do one of the following:
- Direct MIDI to Genome
Make sure your MIDI input is connected to the Genome block. You can use either a specific controller input block or the OMNI MIDI In if you want all incoming MIDI to reach Genome.
With this setup, MIDI should reach Genome and MIDI Learn should work as expected. The downside is that, unlike with automations, the parameter states you set via MIDI can't be stored per rackspace variation.
View attachment 56284
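If MIDI Learn doesn't seem to pick anything up, it can help to confirm what your controller is actually sending before blaming the routing. Here's a minimal sketch using Python with mido, run outside Gig Performer purely as a check; the port name is just a placeholder for whatever your controller shows up as:

```python
# Minimal check of what a hardware controller sends, using python-mido.
# Run this outside Gig Performer, just to confirm the CC numbers/channels
# before doing MIDI Learn in Genome. The port name below is a placeholder.
import mido

print(mido.get_input_names())  # list available MIDI input ports

with mido.open_input("My Controller") as port:  # replace with your port name
    for msg in port:
        print(msg)  # e.g. control_change channel=0 control=20 value=127
```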
- Use Gig Performer widgets to send MIDI internally
Another option is to create widgets in your rackspace and map them to the MIDI commands exposed by the Genome plugin (there are both CC and PC options available; see the screenshot below).
You can then map those widgets to your hardware controller buttons or knobs. This gives you more flexibility, since you can easily remap things later without touching Genome itself.
View attachment 56286
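In case it's useful, here's what those two message types amount to at the MIDI level. This is only an illustration in Python with mido, not something you need inside Gig Performer (the widget mapping above handles it for you), and the port name is just a placeholder:

```python
# Illustration of the two message types Genome exposes: Control Change (CC)
# and Program Change (PC). Inside Gig Performer the widget mapping handles
# this; the snippet only shows the underlying messages. Port name is a placeholder.
import mido

out = mido.open_output("Some MIDI Port")  # replace with a real output port

# CC: controller number 20, value 127, on MIDI channel 1 (mido channels are 0-based)
out.send(mido.Message("control_change", channel=0, control=20, value=127))

# PC: select program/preset 5 on MIDI channel 1 (program numbers are 0-based too)
out.send(mido.Message("program_change", channel=0, program=4))

out.close()
```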
That said, I personally find automations easier and more reliable than sending MIDI directly to Genome, at least for most use cases.