Atomic Tonocracy (Inc NAM support)

NAM is Python-based (PyTorch), which means it cannot easily be made to run on the normal ARM or SHARC platforms most modelers use - not without non-trivial rework, at the least.
No idea, as I’m a sysadmin, not a programmer. But I do know they’ve got it running on the Mod Duo. I have no idea what proc that uses though; I’ve never used one.

EDIT

 
No idea, as I’m a sysadmin, not a programmer. But I do know they’ve got it running on the Mod Duo. I have no idea what proc that uses though; I’ve never used one.

Yeah, the so-called "nano" models. These alone still eat up something like 70% of the DSP on a Mod Duo, which is built around a single AllWinner A20 ARM SoC board :(

For reference, the ToneMaster Pro and Headrush Prime family use similar architectures.

EDIT: ...and the Poly Beebo mentioned below.
 
No idea, as I’m a sysadmin, not a programmer. But I do know they’ve got it running on the Mod Duo. I have no idea what proc that uses though; I’ve never used one.

EDIT

And the Beebo:

Nano models as well.
 
NAM is open source, so I don't think there's a big problem there. Some other hardware already supports it, like the Mod Duo.
NAM itself is, but there may be third-party captures involved in presets that may not vibe with being converted to other formats. Probably fine if it’s just “personal use” situations, but that shit is never straightforward.
 
Well, the NAM captures run in the Tonocracy app on our computers. So are they running the PyTorch stuff?
And wouldn’t the Tonocracy hardware be able to use whatever is optimal to support the same functionality our computers are providing, be it PyTorch or whatever code the app runs on our machines? I’m obviously ignorant about code etc., but it seems like a non-problem.

Let’s see Atomic stick to a plan for longer than six months before getting all excited about hardware possibilities.
I feel your skepticism, but Tom King has more to his resume than bad support episodes. Chances are he isn’t stupid and isn't alone in this, so I remain hopeful. See me in a few months, though, and I might be passing out pitchforks and torches.
 
Well, the NAM captures run in the Tonocracy app on our computers. So are they running the PyTorch stuff?
And wouldn’t the Tonocracy hardware be able to use whatever is optimal to support the same functionality our computers are providing, be it PyTorch or whatever code the app runs on our machines? I’m obviously ignorant about code etc., but it seems like a non-problem.


I feel your skepticism, but Tom King has more to his resume than bad support episodes. Chances are he isn’t stupid and isn't alone in this, so I remain hopeful. See me in a few months, though, and I might be passing out pitchforks and torches.
Right, there are also loads of promised firmware updates still outstanding; there are AA3, AA6, AA12, and Amplifire Boxes that, while awesome products, are also effectively EOL at this point in terms of future updates (hopefully they continue to have parts on hand for repairs?!?).

I’m not saying this is going to wind up in the same sort of limbo-land-by-today’s-standards as those products. But I am absolutely about as certain that my kid is going to bring his empty Dr Pepper can down out of his room before bedtime without me having to ask as I am that this is a new page that is different in kind from all the other previous new pages. Which is…a bit less than 50/50.
 
Right, there are also loads of promised firmware updates still outstanding; there are AA3, AA6, AA12, and Amplifire Boxes that, while awesome products, are also effectively EOL at this point in terms of future updates (hopefully they continue to have parts on hand for repairs?!?).

I’m not saying this is going to wind up in the same sort of limbo-land-by-today’s-standards as those products. But I am absolutely about as certain that my kid is going to bring his empty Dr Pepper can down out of his room before bedtime without me having to ask as I am that this is a new page that is different in kind from all the other previous new pages. Which is…a bit less than 50/50.
In his defence, I used to passively build Coke-can pyramids on my desk over weeks while playing Counter-Strike after school, because I couldn't be bothered taking them downstairs. Time to make some art.
 
And wouldn’t the Tonocracy hardware be able to use whatever is optimal to support the same functionality our computers are providing, be it PyTorch or whatever code the app runs on our machines? I’m obviously ignorant about code etc., but it seems like a non-problem.

TL;DR, no, not currently. NAM is very CPU intensive, in a way that the DSP hardware used in modelers is not designed for.

There's ongoing work to bridge that gap, though.
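
For a rough sense of the numbers, here's a back-of-the-envelope sketch. The layer/channel counts are illustrative guesses of mine, not NAM's exact architecture - the point is just that a WaveNet-style capture runs a stack of dilated convolutions for every single audio sample, so the multiply-accumulates per second pile up fast at 48 kHz:

```python
# Rough back-of-the-envelope cost of a WaveNet-style capture model.
# Layer/channel/kernel counts below are illustrative guesses,
# not NAM's exact "standard" configuration.

sample_rate = 48_000   # samples per second
layers      = 20       # dilated conv layers (assumed)
channels    = 16       # channels per layer (assumed)
kernel_size = 3        # taps per dilated conv (assumed)

# Each layer costs roughly channels_in * channels_out * kernel_size
# multiply-accumulates (MACs) per output sample.
macs_per_sample = layers * channels * channels * kernel_size
macs_per_second = macs_per_sample * sample_rate

print(f"{macs_per_sample:,} MACs per audio sample")
print(f"{macs_per_second / 1e6:.0f} million MACs per second, before IRs and effects")
```

Several hundred million multiply-accumulates per second is nothing for a desktop CPU, but it's a serious chunk of what a small ARM SoC or a modeler's DSP has left once cab IRs and effects are added - which lines up with the ~70% Mod Duo figure mentioned above.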
 
NAM is Python-based (PyTorch), which means it cannot easily be made to run on the normal ARM or SHARC platforms most modelers use - not without non-trivial rework, at the least.

Well, the NAM captures run in the Tonocracy app on our computers. So are they running the PyTorch stuff?
And wouldn’t the Tonocracy hardware be able to use whatever is optimal to support the same functionality our computers are providing, be it PyTorch or whatever code the app runs on our machines? I’m obviously ignorant about code etc., but it seems like a non-problem.
PyTorch is a machine learning framework which you could use to do anything from identifying whether a picture is of a cat to making captures of a guitar amp. It's only used for creating the models. That's the processing-heavy part, which is why the Tonocracy/Tonex/NAM approach of "capture it on your computer or in the cloud" is right for this.

Running those created models is entirely possible on those other platforms. Tonex is also PyTorch-based, just a different setup from NAM. Something like the Poly Beebo running NAM models is an example of how the final model doesn't require a lot of processing power to run and can be ported to work on ARM/SHARC platforms.
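
To make that split concrete, here's a minimal sketch. I'm assuming the .nam capture file is a JSON blob with roughly an architecture name, a config and a flat list of weights (worth checking against the actual spec): PyTorch produces that file during training, and playing it back only needs something that can read the weights and run the maths per sample, no Python involved.

```python
# Minimal sketch: inspect a NAM capture file without PyTorch.
# Assumption: the .nam file is JSON with roughly these top-level keys
# ("architecture", "config", "weights") - check against the real spec.
import json

def describe_capture(path: str) -> None:
    with open(path, "r") as f:
        model = json.load(f)

    arch = model.get("architecture", "unknown")
    config = model.get("config", {})
    weights = model.get("weights", [])

    print(f"architecture: {arch}")
    print(f"config keys : {sorted(config)}")
    print(f"weights     : {len(weights)} floats")
    # Playing the capture back is then "just" applying these weights in a
    # per-sample conv/recurrent loop - no Python or PyTorch required,
    # which is what the C++ engines on pedal hardware do.

describe_capture("my_amp_capture.nam")  # hypothetical file name
```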
 
As I understand it, most ARM implementations have a Linux framework behind them, and it is fairly straightforward to get Python and additional libraries running on Linux. So I don't think it would actually be a huge amount of work to get whatever is needed up and running on ARM; a competent dev team could probably get a prototype up and running in 4 weeks.
 
As I understand it, most ARM implementations have a Linux framework behind them, and it is fairly straightforward to get Python and additional libraries running on Linux. So I don't think it would actually be a huge amount of work to get whatever is needed up and running on ARM; a competent dev team could probably get a prototype up and running in 4 weeks.
Python on ARM is not a problem to my knowledge; it seems to be already working, but it's not as straightforward as "just install it". PyTorch on ARM chips in a pedal etc. would run into processing performance limitations, so it doesn't make a lot of sense to try to do that.

Running the actual models has already been shown to work just fine, though. That's what the Tonex pedal is.
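
The practical test for "does it fit on this chip" is also pretty simple: render a buffer's worth of audio through the model and compare the wall-clock time to the buffer's duration. Rough sketch below - run_model is just a stand-in dummy load so the script runs on its own, not a real NAM or Tonex API:

```python
# Crude realtime-budget check: can this box render a buffer faster than
# the buffer lasts? `run_model` is a placeholder dummy load, not a real
# NAM/Tonocracy/Tonex API.
import time
import numpy as np

SAMPLE_RATE = 48_000
BUFFER_SIZE = 256                           # samples per audio callback (assumed)
BUFFER_SECONDS = BUFFER_SIZE / SAMPLE_RATE

def run_model(block: np.ndarray) -> np.ndarray:
    # Stand-in DSP load: a dummy FIR filter.
    kernel = np.hanning(64)
    return np.convolve(block, kernel, mode="same")

block = np.random.randn(BUFFER_SIZE).astype(np.float32)

iterations = 1000
t0 = time.perf_counter()
for _ in range(iterations):
    run_model(block)
per_buffer = (time.perf_counter() - t0) / iterations

print(f"buffer duration : {BUFFER_SECONDS * 1e3:.2f} ms")
print(f"processing time : {per_buffer * 1e3:.3f} ms per buffer")
print(f"realtime budget : {100 * per_buffer / BUFFER_SECONDS:.0f}% used")
```

If the processing time per buffer creeps anywhere near the buffer duration, the model won't run reliably once cab IRs and effects are stacked on top - which is roughly where the nano models sit on the Mod Duo.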
 
Love me some Atomic Amps, but I stopped caring about plugins a long time ago.

Same here. I was sooo much into plugins for a while that I thought about building a laptop-based live rig (doing it properly would easily exceed the cost of even some high-end standalone modelers, but still, I loved the idea...). That has changed dramatically, so much so that I'm pretty much not using amp plugins at all anymore. The reason possibly being that I just like haptic feedback, WYSIWYG and whatnot (which is why I also don't like using editors much anymore).
Having said that, personally I'd like to see an extended take on an Amplifirebox, maybe a dual-channel version with all parameters per channel exposed just as on the current models - with NAM support, that'd be exactly what'd float my boat. Not sure I'd feel too happy about buying from Atomic, though; after all, support has always been shady and in mainland Europe pretty much non-existent (had to buy my AFB at Andertons).
 
Same here. I was sooo much into plugins for a while that I thought about building a laptop-based live rig (doing it properly would easily exceed the cost of even some high-end standalone modelers, but still, I loved the idea...). That has changed dramatically, so much so that I'm pretty much not using amp plugins at all anymore. The reason possibly being that I just like haptic feedback, WYSIWYG and whatnot (which is why I also don't like using editors much anymore).
Same here. There's a lot of integration in the hardware units that you start to miss when you have to build all of that yourself. Even on a pedalboard, the first thing I miss is how easy it is to set up scenes on Fractal to change multiple things. Even though that's now relatively easy to do on my Luminite Graviton M1 + Strymon pedals rig, it's still not as easy as it is on Fractal or Helix.

Unless you can use just one comprehensive plugin like Helix Native, patching together multiple plugins, each with their own UIs and varying feature sets for e.g. MIDI control, is a bit of a pain in the ass even in something like a VST host app.

A friend of mine does a one-man keyboard + drum machine + vocals band thing using a top-tier gaming laptop (because that was the most powerful thing at the time), and he told me it took him something like a year to configure it to work the way he likes to use it.
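
Some of that glue can at least be scripted. As a sketch of what I mean (using the mido library - the port names and program numbers here are made up), one "scene" choice can be fanned out as program changes to several destinations at once, which is the part hardware units give you for free:

```python
# Sketch: fan one "scene" choice out to several MIDI destinations,
# roughly what a scene switch on a Fractal/Helix does in one press.
# Port names and program numbers are made up for illustration.
import mido

SCENES = {
    "clean":  {"Helix Native In": 0, "Strymon TimeLine": 3},
    "rhythm": {"Helix Native In": 1, "Strymon TimeLine": 7},
    "lead":   {"Helix Native In": 2, "Strymon TimeLine": 9},
}

def switch_scene(name: str) -> None:
    for port_name, program in SCENES[name].items():
        with mido.open_output(port_name) as port:
            port.send(mido.Message("program_change", program=program))
        print(f"{port_name}: sent program change {program}")

switch_scene("lead")
```

It works, but you end up maintaining that mapping yourself for every plugin and pedal, which is exactly the kind of integration the hardware boxes bake in.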
 
Unless you can use just one comprehensive plugin like Helix Native, patching together multiple plugins, each with their own UIs and varying feature sets for e.g. MIDI control, is a bit of a pain in the ass even in something like a VST host app.

I did that once and it worked. Even quite well. And I think I do have the patience for the software-puzzle thing.
But with a laptop on stage there's so much more than just the software side of things.
Where do you place the laptop? And where do you place the audio interface? Do you *really* want to rely on shady USB connectors live (spoiler: no way, at least not without securing them big time on both ends)? What MIDI controller? And whatnot.
Things have gotten better, but a laptop is still a rather fragile thing to carry around and place on stages. And unlike keyboard players, as a guitar player you're usually exposed quite a bit more.
If I were to go for such a setup today, I'd likely get a Mac Mini (even the cheapest entry-level version should do, maybe with 16 GB of RAM) and remote control it from an iPad (AFAIK, that's perfectly possible without ever connecting anything else). With a bit of tinkering, the Mac Mini and interface could perhaps even be placed inside a double-tier pedalboard (Schmidt-Array kinda style), and as there are no critical moving parts inside the computer anymore, that should be a pretty safe thing to operate, too.
I'm sure I could slap a great-sounding and great-performing setup together that way (MainStage would likely be sufficient as a host), but as I would love to have many knobs (surprise, surprise) and ideally scribble-strip kinda things, the entire setup would cost a whole lot of money.

As a result, I would only go for something like that if I wanted to do some stuff that's otherwise pretty much impossible, such as intense looping, synth-style sounds, individual string processing of a hex pickup or whatever. For any typical guitar setup, it'd be too much of an effort, especially given that there are so many nice ready-to-roll things out there.
 
I did that once and it worked. Even quite well. And I think I do have the patience for the software-puzzle thing.
But with a laptop on stage there's so much more than just the software side of things.
Where do you place the laptop? And where do you place the audio interface? Do you *really* want to rely on shady USB connectors live (spoiler: no way, at least not without securing them big time on both ends)? What MIDI controller? And whatnot.
Things have gotten better, but a laptop is still a rather fragile thing to carry around and place on stages. And unlike keyboard players, as a guitar player you're usually exposed quite a bit more.
If I were to go for such a setup today, I'd likely get a Mac Mini (even the cheapest entry-level version should do, maybe with 16 GB of RAM) and remote control it from an iPad (AFAIK, that's perfectly possible without ever connecting anything else). With a bit of tinkering, the Mac Mini and interface could perhaps even be placed inside a double-tier pedalboard (Schmidt-Array kinda style), and as there are no critical moving parts inside the computer anymore, that should be a pretty safe thing to operate, too.
I'm sure I could slap a great-sounding and great-performing setup together that way (MainStage would likely be sufficient as a host), but as I would love to have many knobs (surprise, surprise) and ideally scribble-strip kinda things, the entire setup would cost a whole lot of money.

As a result, I would only go for something like that if I wanted to do some stuff that's otherwise pretty much impossible, such as intense looping, synth-style sounds, individual string processing of a hex pickup or whatever. For any typical guitar setup, it'd be too much of an effort, especially given that there are so many nice ready-to-roll things out there.
For me using a laptop on stage would not be a huge problem. I've seen several Synthwave bands rocking that sort of rig even for guitar tones and it was fine. I'd treat it sort of like a rack rig - computer and audio interface out of the way, MIDI controller somewhere closer to the front of the stage.

I don't think remote controlling a Mac Mini with an iPad would work that well. I sometimes use the macOS Sidecar feature to use an older iPad Pro as a second display, and it can be flaky - sometimes it gets stuck after returning from sleep, or it might fail if the connection is not good. And it does not fail gracefully at all! A big problem with it is also that you cannot do anything but scroll with your finger. Clicking on anything requires using an Apple Pencil. I know, it's insane, right?
 