'FX Loop' Effect Module
Simply providing send and return audio ports, a wet/dry dial and pre/post gain dials.
For routing audio through external effects units!
Oscar
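The module described above is simple enough to sketch. Here's a minimal per-sample illustration; the `FXLoop` type, its parameter names, and the send/return split are illustrative assumptions, not anything from BM3:

```swift
// Hypothetical sketch of the proposed FX Loop module's per-sample behaviour.
// Parameter names (preGain, postGain, mix) are made up for illustration.
struct FXLoop {
    var preGain: Float = 1.0   // applied before the send output
    var postGain: Float = 1.0  // applied to the return input
    var mix: Float = 0.5       // 0 = fully dry, 1 = fully wet

    // What goes out to the external effects unit.
    func send(_ dry: Float) -> Float {
        return dry * preGain
    }

    // Mix the hardware return back against the untouched dry signal.
    func output(dry: Float, returned: Float) -> Float {
        let wet = returned * postGain
        return dry * (1 - mix) + wet * mix
    }
}
```

The send path goes out to the external unit; whatever comes back from the hardware is mixed against the dry signal according to the wet/dry dial.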
Comments
You can send audio from aux tracks to any audio output, and any audio track to any input.
Maybe an effect that attaches directly to audio ins and outs, with a ping feature like desktop DAWs?
Does iOS AUv3 even support PDC? Could this even be done well on iOS? I have no idea about these things.
"The AUv3 standard supports it. There's a 'latency' property which can (in theory) communicate a plugin's built-in processing latency to the host. I'm not sure if any host implements it."
So yeah, let's hope they can push forward with this.
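For reference, the property mentioned in the quote is `AUAudioUnit.latency`, which reports a plugin's processing latency in seconds. A host doing PDC would convert that to whole sample frames, roughly like this (the function name is mine):

```swift
// Convert an AUv3-style latency value (reported in seconds) into whole
// sample frames, the unit a host would actually delay other tracks by.
func latencyInFrames(latencySeconds: Double, sampleRate: Double) -> Int {
    return Int((latencySeconds * sampleRate).rounded())
}
```

So a plugin reporting 10 ms of latency at 48 kHz would need the rest of the mix delayed by 480 frames.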
Keep in mind that the AUv3 standard is shared across macOS and iOS, but not all features available on macOS are also implemented on iOS. Notably, some plugin types (MIDI Generator, etc.) are not supported on iOS. It could be a matter of host implementation, but maybe some features require OS/framework updates as well.
Also, latency compensation as you mentioned would be fantastic. At minimum I'd like the ability to 'offset' a track or audio clip in time by a small amount for this purpose (send/return FX), but just having it baked in would be lovely.
@OscarSouth
Export to final mix will never work unless the final mix is a 1:1 realtime render rather than an offline render; that would need splitting out into a separate FR.
If at all possible I would always suggest a bounce, just for project/session archiving.
@brambos
So is the latency tag missing from iOS AUv3, or is it just not supported by hosts/plugin developers yet?
Obviously it's less important in this case, because the BM3 developers could easily implement their own PDC for their native effects, which this would be.
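For what it's worth, the host side of PDC is conceptually simple for native effects, since the host knows each effect's latency: delay every track by the difference between the largest latency in the project and that track's own. A toy sketch, with frame counts purely illustrative:

```swift
// Host-side PDC sketch: delay each track by (maxLatency - ownLatency),
// in frames, so everything arrives at the master in sync.
func compensationDelays(latencies: [Int]) -> [Int] {
    guard let maxLatency = latencies.max() else { return [] }
    return latencies.map { maxLatency - $0 }
}
```

For example, tracks with latencies of 0, 64 and 256 frames would get compensation delays of 256, 192 and 0 frames respectively.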
I actually tried to do a realtime bounce of the master track to an audio channel, but I could only choose a bank/pad as an internal input.
If it were possible to choose the master track as an internal input, then audio tracks acting as effects returns could be set to monitor, and what I'm talking about would be theoretically possible, provided the monitored tracks were also recorded through the internal master track routing.
I think there is a 1:1 realtime render FR already, would need to check on that though.
What you're talking about would only be theoretically possible in Dr Who's reality, haha.
With the master track as an input, hitting record would mean the master recording itself, and the feedback would tear your ears apart.
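To make the feedback point concrete: a loop of the form y[n] = x[n] + g * y[n - d] grows without bound as soon as the round-trip gain g reaches 1. A toy illustration, with all values made up:

```swift
// Toy feedback loop: a single impulse fed back on itself with gain `gain`
// and delay `delay` frames. Once gain >= 1 the peak level climbs forever.
func peakAfterFeedback(gain: Float, delay: Int, samples: Int) -> Float {
    var y = [Float](repeating: 0, count: samples)
    for n in 0..<samples {
        let input: Float = (n == 0) ? 1.0 : 0.0          // one impulse in
        let fedBack: Float = n >= delay ? y[n - delay] : 0
        y[n] = input + gain * fedBack
    }
    return y.map { abs($0) }.max() ?? 0
}
```

With a loop gain of 0.5 the echoes die away; at 1.2 the peak has already grown more than tenfold after a couple of hundred samples.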
What you want is a 1:1 realtime record, so that you can route external inputs through internal processing and then have that 1:1 recording added to an audio track, am I right?
I don't know. I never looked into it, but it could be. Just wanted to point out that it being in the standard is no guarantee it will be available on iOS at this moment.
Surely it must be possible in some simple way to render an external hardware effects return in real time though?
Definitely needs its own feature request thread though, so it doesn't get lost.