Why Is There a Big Discrepancy Between Monitored Audio Tracks and Recorded Tracks?
edited April 2018 in General
I have been noticing some major volume issues in Beatmaker 3 that I have not seen in any other DAW. It is hard for me to pinpoint what is going on, but I will get everything set up with the guitar and it sounds good, then all of a sudden it comes through my headphones way too loud when I start to record. Any ideas?
There are a couple of audio settings - pad layer volume and audio preview volume. Maybe check there first?
One thing I just noticed... I have a guitar and a bass. I am using an audio interface on multiple tracks. I seem to have to disable the audio interface on one of the tracks to avoid getting feedback from the other track even though it is not armed and the monitor is not on.
The other thing is, the signal level monitored doesn't match the signal recorded. So if I turn the audio down so it's not too loud, it records too quiet. I have never seen this in a DAW before. Usually the volume you hear when you record is the same volume you hear when you play it back; but here, there is a dramatic difference.
Turn monitoring off in the mixer. It sounds like you are monitoring via your audio interface (hardware) and through B3's mixer (software). This is exactly what happens in any DAW when you use hardware monitoring and software monitoring at the same time.
You are basically hearing the input signal twice, so it seems too loud.
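For what it's worth, the size of that jump is easy to quantify: two coherent (in-phase) copies of the same signal sum to double the amplitude, which works out to about +6 dB. A quick sketch of the arithmetic (plain Python, purely illustrative):

```python
import math

def summed_level_db(num_copies: int) -> float:
    """Level change in dB from summing identical, in-phase copies of a signal."""
    return 20 * math.log10(num_copies)

# Hearing the input via hardware AND software monitoring = two copies:
print(round(summed_level_db(2), 2))  # -> 6.02, a very noticeable jump
```

That also explains why the recording comes out quieter than what you heard: only one of the two paths actually hits disk.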
This doesn't make sense. I have an Apogee Jam. There is only one knob on the interface -- for volume in. There is no external audio monitor, my headphones are connected to the iPad.
You recommended turning off monitoring in the mixer in BM3? How can I hear myself play then? I just tried this and I was able to record without hearing what I was recording, which is weird enough, but then when I played back the recording it was also insanely loud.
As I wrote this I was trying to write it politely, so I mean all of this politely. It just doesn't make any sense to me.
I deleted a bunch of stuff and reloaded. Got some better results without changing anything, it seems buggy. Maybe it is the AUv3. I will try and figure it out in the morning.
With the information you provided originally, it made perfect sense: most audio interfaces allow you to hardware monitor as an option, and if you hardware monitor AND software monitor at the same time via B3's mixer, you will double your monitoring volume, but only half of that level will be recorded.
Now that I know it is just an Apogee Jam, which is an aggregation device rather than a full audio interface, it is obvious you only have software monitoring.
Do you have any FX on the track? I found to my surprise that BM3 records the effected post-fader output of the channel, whereas every other DAW I have records the dry signal and then applies the FX on playback (and monitoring), then offers a Freeze Track option.
So, I found that if I had FX on the channel, the effected signal was then fed back through the FX again. Guitar fed through ToneStack fed back into ToneStack could be good for something, but nothing that I’m trying to do.
What I do now is set up two tracks. The first has no FX, volume set to just below 0db peaks, is armed, and is not monitored. This is to capture the dry signal.
The second channel’s input source is set to the first channel and is monitored but not armed.
Once I’ve recorded the dry track, I mute it. Playback comes through the second (wet) track. I can adjust to my heart’s content, then when I’m ready to “freeze” the wet track, I record the audio, then disable or remove the FX, and disconnect the audio routing from the first track.
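The routing above can be sketched in miniature. This is just an illustrative simulation, not BM3's actual API; the hard-clip `effect` function is a hypothetical stand-in for whatever FX chain sits on the wet track:

```python
def effect(sample: float) -> float:
    # Stand-in for the wet track's FX chain (a crude hard clip here).
    return max(-1.0, min(1.0, sample * 3.0))

live_input = [0.1, 0.4, -0.5]

# Track 1: armed, no FX, not monitored -> captures the dry signal as-is.
dry_track = list(live_input)

# Track 2: input source set to Track 1, monitored but not armed ->
# you HEAR the wet signal while only the dry signal hits disk.
monitored_output = [effect(s) for s in dry_track]

# Recording through the FX instead bakes the effect into the file;
# playing that file back through the same channel applies the FX twice:
double_processed = [effect(effect(s)) for s in live_input]

print(dry_track)         # what gets recorded
print(monitored_output)  # what you hear while playing
```

The point of the split is visible in the last line: run the wet recording back through the same channel and the effect stacks, which is exactly the ToneStack-into-ToneStack problem.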
Seems a lot more convoluted than it needs to be, and you also end up with a bunch of dry tracks that you don’t need cluttering up the list.
Could be there’s an easier way. I ran out of patience for looking once I found something that could get the job done.
@number37 Yeah, I think it has something to do with it for sure. I really want to see a few simple functions worked out in the next release. I don't know why developers seem to miss the fact that niche markets hold tremendous opportunity for profit. Either Auria or Cubasis opens up room for better electronic music production, or BM3 et al. open up functionality for people who integrate real instruments when they produce.
The fact is, laying down a real bassline on a track -- especially live -- is really awesome. @mathieugarcia I want hard-set loop pattern lengths for audio, so I can just keep playing without having to stop everything I am doing to go into the sample editor, change the length of the loop back to 8 bars, and then trim a sample.
Someone needs to sit down with their iPad and a bass or guitar, as well as a keyboard, and start playing in order to understand what needs to be done to make that experience fast and efficient. Once that happens, BM3 becomes the leader in performance production. It is so close to being there. So close...
If you want to monitor wet and record dry you can.
You create a second track with no effects, you record that one unmonitored, and monitor but do not record the one with the effect on.
This way you have the option to record wet or dry.
Best case scenario, I would just like to put some effects on my audio channel, record what I play while I am listening to all the other tracks playing, and have that exact level I hear be the level that gets recorded. I have never had an issue with this in any other DAW -- iOS, OSX, or Windows -- never seen this before. I am just craving that simple fluidity of being able to work in a scene mode and have locked bar lengths and level matching. I have never seen a DAW that has odd bar lengths either -- especially in a scene or clip mode. I love BM3 -- so pumped on it. It is fantastic, well written, love the graphics. But these two things pretty much always cause me to have to stop BM3 and start editing, which means I can't perform live with it.
An iPad is unpredictable anyway for live performance -- I understand the pitfalls of that, but having to trim audio gets in the way of jamming creatively to build up a project.