'FX Loop' Effect Module

Simply provides send and return audio ports, a wet/dry dial, and pre/post gain dials.

For routing audio through external effects units!

Oscar
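
For illustration, here is a minimal sketch of the per-block maths such a module implies, assuming plain Float buffers and hypothetical names throughout; a real implementation would run on the host's own audio buffer types.

```swift
// Hypothetical per-block processing for the requested FX loop module.
// `input` is the track signal, `hardwareReturn` is the audio coming back from
// the external unit, and `mix` is the wet/dry dial (0 = fully dry, 1 = fully wet).
func processFXLoop(input: [Float],
                   hardwareReturn: [Float],
                   preGain: Float,
                   postGain: Float,
                   mix: Float) -> (send: [Float], output: [Float]) {
    // Pre gain shapes what leaves the send port for the external unit.
    let send = input.map { $0 * preGain }

    // Post gain shapes the return, which is then blended with the dry signal.
    var output = [Float](repeating: 0, count: input.count)
    for i in 0..<input.count {
        let wet = i < hardwareReturn.count ? hardwareReturn[i] : 0
        output[i] = input[i] * (1 - mix) + wet * postGain * mix
    }
    return (send, output)
}
```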

Comments

  • edited August 2017
    Other than the pre/post, doesn't this already exist?
    You can send audio from aux tracks to any audio output, and any audio track to any input.

    Maybe an effect that attaches directly to audio ins and outs, with a ping feature like desktop DAWs? (There's a rough sketch of the ping idea at the end of the thread.)
    Does iOS AUv3 even support PDC? Could this even be done well on iOS? I have no idea about these things.
  • Quoting @brambos here from October 2016

    "The AUv3 standard supports it. There's a 'latency' property which can (in theory) communicate a plugin's built-in processing latency to the host. I'm not sure if any host implements it."

    So yeah, let's hope they can push forward with this :)
  • @5pinlink said:
    Quoting @brambos here from October 2016

    "The AUv3 standard supports it. There's a 'latency' property which can (in theory) communicate a plugin's built-in processing latency to the host. I'm not sure if any host implements it."

    So yeah, let's hope they can push forward with this :)

    Keep in mind that the AUv3 standard is identical across macOS and iOS, but not all features available on macOS are also implemented on iOS. Notably, some 'plugin types' (MIDI Generator, etc.) are not supported on iOS. Could be a matter of host implementation, but maybe some features require OS/framework updates as well. (A rough sketch of how the 'latency' property gets used is at the end of the thread.)

  • @5pinlink I send/return in this way at the moment, but (if my logic on how things 'should' work is correct) if you could just stick the hardware in an effects loop on an AUX track, it should export to the final mix without having to bounce it down to an audio track in real time.

    Also, latency compensation as you mentioned would be fantastic. I'd definitely (at minimum) like the ability to 'offset' a track or audio clip by a small amount of time for this purpose (send/return FX), but just having it baked in would be lovely.
  • edited August 2017

    @OscarSouth
    Export to the final mix will never work unless the final mix is a 1:1 realtime render rather than an offline render; that would need splitting out into a separate FR.
    If at all possible, I would always suggest a bounce, just for project/session archiving.

    @brambos
    So is the latency tag not in iOS AUv3, or is it just not supported by hosts/plugin developers yet?
    Obviously it is less important in this case, because the BM3 developers could easily implement their own PDC for their native effects, which this would be.

  • edited August 2017
    @5pinlink ah yeah you're definitely (and plainly logically) right. Maybe a real time render option then! (Ok, I'm going quite far for slightly more convenience).

    I actually tried to do a real-time bounce of the master track to an audio channel before, but I could only choose a bank/pad as an internal input.

    If it were possible to choose the master track as an internal input, then audio tracks acting as effects returns could be set to monitor, and what I'm talking about would be theoretically possible, provided the monitored tracks were also recorded through the internal master track routing.
  • I think there is a 1:1 realtime render FR already; I'd need to check on that, though.

  • edited August 2017

    What you're talking about would only theoretically be possible in Dr Who's reality haha.
    Master track as input, then record, would mean it's recording itself, and the feedback would tear your ears apart.

    What you want is a 1:1 realtime record so that you can route external inputs through internal processing, then have that 1:1 recording added to an audio track, am I right?

  • @5pinlink said:
    So is the latency tag not in iOS AUv3, or is it just not supported by hosts/plugin developers yet?
    Obviously it is less important in this case, because the BM3 developers could easily implement their own PDC for their native effects, which this would be.

    I don't know. I never looked into it, but it could be. Just wanted to point out that it being in the standard is no guarantee it will be available on iOS at this moment.

  • edited August 2017
    @5pinlink ah yeah haha. I'm typing and walking... I guess my brain can't fit in thinking along with those other two.

    Surely it must be possible in some simple way to render an external hardware effects return in real time though?
  • Yeah, it is just a 1:1 export to audio option, so rather than rendering offline, it plays through the timeline and records everything.

    Definitely needs its own feature request thread though, so it doesn't get lost ;)
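
On the 'ping feature' mentioned near the top of the thread: desktop DAWs typically measure the hardware round trip by playing a short impulse out of the send and timing when it shows up again at the return. A minimal sketch of that measurement, assuming the return has already been captured into a plain Float buffer; the function and variable names here are made up for illustration.

```swift
// Find how many samples late the ping arrives at the return input.
// `returnBuffer` is assumed to start at the exact moment the impulse was sent.
func measureRoundTripFrames(returnBuffer: [Float], threshold: Float = 0.1) -> Int? {
    // The first sample that crosses the threshold marks the returning ping.
    guard let index = returnBuffer.firstIndex(where: { abs($0) >= threshold }) else {
        return nil   // the ping never came back, e.g. nothing is connected
    }
    return index
}

// Example: a return where the ping shows up 412 samples late.
var fakeReturn = [Float](repeating: 0, count: 1024)
fakeReturn[412] = 0.9
print(measureRoundTripFrames(returnBuffer: fakeReturn) ?? -1)   // prints 412
```

The host could then shift the recorded return earlier by that many frames, which is the manual version of the compensation discussed above.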
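
On the 'latency' property quoted from @brambos: in the AUv3 API it is the read-only `latency` property on `AUAudioUnit`, reported in seconds, which a plugin can override and a host can read to line its parallel paths back up. A rough sketch under those assumptions; the class, its lookahead figure and the helper function are invented for illustration, and whether iOS hosts actually honor the value was exactly the open question above.

```swift
import AudioToolbox
import Foundation

// Plugin side: a made-up AUv3 effect with 256 frames of internal lookahead.
// Overriding `latency` is how a plugin can, in theory, report its built-in
// processing delay to the host.
class LookaheadEffectAudioUnit: AUAudioUnit {
    let lookaheadFrames: Double = 256
    let sampleRate: Double = 44_100

    override var latency: TimeInterval {
        lookaheadFrames / sampleRate   // seconds
    }
}

// Host side (conceptual): read the reported latency and delay every parallel
// "dry" path by the same number of frames. That is essentially all PDC is.
func compensationFrames(for unit: AUAudioUnit, at sampleRate: Double) -> Int {
    Int((unit.latency * sampleRate).rounded())
}
```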