Quantize off not playing back exactly what I record (major fix needed)

edited December 2017 in Support
Beats need to play back exactly as played - this is a major, major "quantize off" flaw.
Also, when editing notes in pattern edit, we need to hear quantize changes in real time while the track is playing, so we can hear them before applying them, i.e. while adjusting a swing change on highlighted notes.
Also to add: a quantize per pad is needed.
A nudge button for selected notes in pattern edit would also be ideal.

Comments

  • edited December 2017
    By Quantize off, do you mean quantize is turned off, and the recording is wrong, or the auto quantize is changing when the notes turn off ?

    Quantize works on selected pads, select one pad and it will only quantize that pad.
    Realtime/non destructive Quantize requested here
    https://intua.net/forums/discussion/5640/iterative-quantization#latest
    Note nudge requested here
    https://intua.net/forums/discussion/5148/nudging-snares-claps-sample-at-a-time?new=1
     
  • ok thanks - I think I know what you mean - will try that "select" option. The "quantize off" though is bugged out.
    cheers though.
  • By Quantize off, do you mean quantize is turned off, and the recording is wrong, or the auto quantize is changing when the notes turn off ?
  • quantize is turned off and the timing is wrong - it's not playing back exactly what I'm recording when I've recorded with quantize off.
  • 5pinlink said:
    By Quantize off, do you mean quantize is turned off, and the recording is wrong, or the auto quantize is changing when the notes turn off ?

    Quantize works on selected pads, select one pad and it will only quantize that pad.
    Realtime/non destructive Quantize requested here
    https://intua.net/forums/discussion/5640/iterative-quantization#latest
    Note nudge requested here
    https://intua.net/forums/discussion/5148/nudging-snares-claps-sample-at-a-time?new=1
     

    cheers dude :-)
  • 5pinlink said:
    ok - I'm not getting latency though - sounds are triggering as I tap the pad.
  • @beatmakerstorm I think that's what @bcrichards means. you hear the audio just fine, but the recording comes out not as you would expect. would you say the notes are recorded to the pattern with a slight delay from when you triggered them? or does it seem less consistent, seemingly snapping back or forward to some kind of ghost quantize point?
  • edited December 2017
    ronji said:
    @beatmakerstorm I think that's what @bcrichards means. you hear the audio just fine, but the recording comes out not as you would expect. would you say the notes are recorded to the pattern with a slight delay from when you triggered them? or does it seem less consistent, seemingly snapping back or forward to some kind of ghost quantize point?

    @ronji snapping back or forward to some kind of ghost quantize point - yes.
    bm3 does not play back exactly what I play when I set quantize off.
    If possible, can you do a test to see if you get the same issue?
    Quantize ON is OK; it's just quantize OFF that's not.

  • edited December 2017
    @beatmakerstorm I just did a quick test (which I don't want to share at the moment, because I'm terrible at finger drumming) and I do believe there's some kind of offsetting happening to the recorded notes. dunno if it's quantize or latency.

    I set up a new session: bank 1 has hihats every 1/8th note for timing reference, bank 2 has my amen kit, and then I have two audio tracks. I set audio track 1 to record the audio from bank 2, and bank 2 also recorded its own midi. I then recorded as I played through 4 bars of finger drumming on bank 2 (on the ipad screen, no controller), then I let it loop back and play what was recorded. I then cut the recorded audio at the 4 bar marks, moved over the first part (live audio recording) to line up with the midi recorded on bank 2, and I dragged the second part (audio recorded from the midi notes) out of the track helper of audio track 1 into audio track 2 to line up with everything. I did have to shift the start and end points for this "pasted" audio because it didn't do what I expected.

    Now if I mute bank 2 and play the live recording in audio track 1 along with the midi recording in audio track 2, I can definitely hear phasing, but no trainwreck sound I would expect to hear from badly offset notes. If I mute audio track 1 and play the recorded midi notes along with the midi recording in audio track 2, it just sounds loud, so this supports that the recorded midi and the audio from the recorded midi are lined up perfectly.

    anyone else wanna have a go at this, and maybe share a recording of it? =)
  • edited December 2017
    ronji said:
    @beatmakerstorm I just did a quick test (which I don't want to share at the moment, because I'm terrible at finger drumming) and I do believe there's some kind of offsetting happening to the recorded notes. dunno if it's quantize or latency. I set up a new session, bank 1 has hihats every 1/8th note for timing reference, bank 2 has my amen kit, and then I have two audio tracks. I set audio track 1 to record the audio from bank 2, and bank 2 also recorded its own midi. I then recorded as I played through 4 bars of finger drumming on bank 2 (on the ipad screen, no controller), then I let it loop back and play what was recorded. I then cut the recorded audio at the 4 bar marks, moved over the first part (live audio recording) to line up with the midi recorded on bank 2, and I dragged the second part (audio recorded from the midi notes) out of the track helper of audio track 1 into audio track 2 to line up with everything. I did have to shift the start and end points for this "pasted" audio because it didn't do what I expected. now if I mute bank 2 and play the live recording in audio track 1 along with the midi recording in audio track 2, I can definitely hear phasing, but no trainwreck sound I would expect to hear from badly offset notes. If I mute audio track 1 and play the recorded midi notes along with the midi recording in audio track 2, it just sounds loud, so this supports that the recorded midi and the audio from the recorded midi are lined up perfectly.

    anyone else wanna have a go at this, and maybe share a recording of it? =)

    ok, thanks for going through an in-depth test - it is a major flaw. I do like bm3, but INTUA have released a buggy product which I have bought with my money. It's not really good enough to release a product and then work through multiple bugs after the customer has bought it. A few bugs are OK, but there are plenty, plus many missing features. QUANTIZE - whether set ON or OFF - is the most important feature of any groovebox/sampler. Who tested the BM3 beta before release, to not realise during testing that notes you record with quantize off don't play back as recorded?

  • Contact Apple, the product doesn't work as described, get your money back ;)
  • edited December 2017
    5pinlink said:
    Contact Apple, the product doesn't work as described, get your money back ;)

    5pinlink, why are you saying that? I decide, mate.
    I do appreciate you helping out on the forum, but the statement you just posted is a bit bold. I like bm3; it's just that those quirks need to be fixed.
  • Don’t take offense, @beatmakerstorm, I’m sure @5pinlink was aiming to save you from any more grief if the current state of the app isn’t up to par for you, as you seemed to be getting a little upset about all this. Sometimes people would rather get their money back and they may not know that’s possible. The quirks will get worked out, and hopefully sooner rather than later.
  • ronji said:
    Don’t take offense, @beatmakerstorm, I’m sure @5pinlink was aiming to save you from any more grief if the current state of the app isn’t up to par for you, as you seemed to be getting a little upset about all this. Sometimes people would rather get their money back and they may not know that’s possible. The quirks will get worked out, and hopefully sooner rather than later.

    no worries dude - I just don't want INTUA seeing that and thinking I dislike bm3. The reason I point out a few things is that it's a great app and I want to see it be no. 1 in all aspects.
  • You said it wasn't good enough, so you have the option to get a refund.
    It is great to get involved with software, but if it is distressing you like it seems it is, it is time to move on, life is not long enough and you should be making music not getting stressed ;)
  • edited December 2017
    Yearly wipeout.
  • edited December 2017
    The problem is we are dealing with perception: unless you take a recording on a separate recorder that captures your performance and the backing track as a single audio track (cassette, portable digital, anything realtime) to compare to what Beatmaker is recording, everything being stated is unquantifiable.

    I am not saying that Beatmaker is not making a mess, but somebody saying it is so does not make it so. The MPC4000 having more groove than the 2500 (a debate that still flares up even now), every Saturday night another person singing karaoke convinced they are in key and on beat (we have all seen them) - all perception. That is why I suggested getting a refund; if it's not working, life is too short.

    If you want to get the issue to intua to fix, you need the recordings to compare or they have no guide stick to work off.

    (Note here i am not making any karaoke remarks about the issue posted here, or the poster of said issue, it could well be a very very serious issue that the rest of us haven't noticed yet)
  • BM3 (and BM2) need to go down to ‘Samples’ measurements. 
    Which is ‘almost’ free..
    (a bit like Retina display)

    ___

    King

    ..
  • i'd love to see a video of someone whose chops can out-swing 960 PPQN :)


  • edited December 2017
    Yearly wipeout.
  • What I've learned over the years is that my personal 'timing' is far from perfect, and that setting the BPM correctly in any sequencer plays a huge part in getting things on the spot, especially when the resolution of the sequencer gets higher. I mean, with a lowish resolution like Cubasis's 48ppqn with quantise turned OFF I can still land most of the notes on the right spot, but with Logic and its 960ppqn the timing is off by a mile when zooming in to tick resolution.

    What bugged me most in BM3 before was that the end of short notes was quantised (meaning the shortest note that could be recorded was the quantise value). Thankfully that's fixed now and can be optionally turned on/off depending on the content being recorded. (For 'One Shots' this doesn't matter, but when 'Hold' is used as the trigger mode it's quite evident that this feature was badly needed.)

    I do have to give Apple credit for GarageBand and its non-destructive quantise features.
  • edited December 2017

    this is a puzzling thread. I mean, what is one tick at 960 ppq in milliseconds - is it about 0.5ms??
    edit: that's at 120bpm. that's quantum time there ;)

  • lots of great info in this thread! so, ppqn = pulses per quarter note; it's not a set amount of time but depends on the tempo of the project. if you wanted to double the ppqn superficially, you could double the tempo of the project and perform in half time, for example (from what I've googled). I never knew about any of this, but it makes sense with my test. phasing between the recorded audio of my performance and the midi-recorded audio means the offset is very slight. maybe it gets worse with a more complex project (my test was in a very minimal project), but as others have said, this is the way it goes with software in most situations.
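The conversion discussed above can be sketched in a few lines of Python. This is just an illustration of the arithmetic (the function name is mine, not anything from BM3): since PPQN divides a quarter note rather than a fixed unit of time, tick length depends on tempo, and doubling the tempo halves the tick.

```python
# Sketch: one sequencer tick's length in milliseconds depends on tempo,
# because PPQN subdivides a quarter note, not a fixed unit of time.

def tick_ms(bpm: float, ppqn: int) -> float:
    """Duration of one sequencer tick in milliseconds."""
    quarter_note_ms = 60_000.0 / bpm   # one beat (quarter note) in ms
    return quarter_note_ms / ppqn

# At 120 BPM a quarter note lasts 500 ms, so one 960-PPQN tick is ~0.52 ms.
print(round(tick_ms(120, 960), 3))   # 0.521

# Doubling the tempo (while performing in half time) halves the tick length,
# which is the "superficial resolution doubling" mentioned above.
print(round(tick_ms(240, 960), 3))   # 0.26
```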
  • edited December 2017
    yeah, my calculation, if it's right, was for 120 bpm.

  • edited December 2017
    At 120 BPM 4:4 signature
    Quarter note, or 1 beat, is 60/120 (number of seconds in a minute divided by number of beats in a minute) = 0.5 seconds (500ms)
    So a single pulse at 120 BPM 4:4 is 500ms/960, roughly 0.52ms
    Average sense-to-brain perception is close to 40ms, which we normally halve to cover multiple senses or interaction; this gives us the universally accepted 20ms ear reaction time that most consumer tech will be based on. Trained ears (musicians, gamers etc) will try to halve this again, because they will notice timing changes in the 6-10ms threshold if it is a continual part of their daily lifestyle; however, never underestimate a good oldschool DJ, who generally will be in the 3-6ms threshold. (Not digital DJs who rely on auto BPM and such; there is a huge huge difference)
    At roughly 0.52ms you have no way of perceiving one pulse relative to the next; this is a human impossibility (but ppqn was never about tighter timing ;) )

    FYI i was born deaf, so got a bit of an interest in this stuff years ago, after all my operations as a child, i went on to have a fascination with all this stuff, i haven't been tested for a while but a few years back i was in the 0-2ms threshold (extremely rare) now with age i suspect that has lowered a little.

    Oh and don't be thinking that smaller threshold is always better, if i program strings from a sampler like kontakt, i "Have" to record to stems and then edit the stems to sound like they attack in the right places, everybody listening on thinks i am a complete freak and in blind tests only i can tell the difference.
    I have wasted too many hours fixing things only i can hear :(
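The arithmetic in the post above can be sketched in Python, comparing per-tick durations at 120 BPM against the rough perception thresholds mentioned in the thread (the threshold values here are taken from the discussion, not from any measurement of mine):

```python
# Sketch of the arithmetic above: single-tick durations at 120 BPM versus
# rough human perception thresholds (threshold figures as cited in the thread).

BPM = 120
QUARTER_NOTE_MS = 60_000 / BPM          # 500 ms per beat at 120 BPM

AUDIBILITY_FLOOR_MS = 2                 # lower bound of auditory fusion cited above

for ppqn in (48, 96, 480, 960):
    tick = QUARTER_NOTE_MS / ppqn
    verdict = "above" if tick >= AUDIBILITY_FLOOR_MS else "below"
    print(f"{ppqn:>4} PPQN -> {tick:6.3f} ms per tick ({verdict} the ~2 ms floor)")
```

Note how a single-tick error at Cubasis-style 48 PPQN (about 10.4 ms) sits above the cited 2 ms fusion floor, while one tick at 960 PPQN (about 0.52 ms) falls well below it, which matches the point that PPQN was never about audibly tighter timing.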
  • Nick Herbert, Elemental Mind;

    How finely can we divide our little 3-second lives? The shortest perceivable time division – sensory psychologists call it the fusion threshold – is between 2 and 30 milliseconds (ms) depending on sensory modality. Two sounds seem to fuse into one acoustic sensation if they are separated by less than 2 to 5 milliseconds. Two successive touches merge if they occur within about 10 milliseconds of one another, while flashes of light blur together if they are separated by less than about 20 to 30 milliseconds.
  • I find all of this stuff very interesting. =) thanks for sharing your story, @5pinlink. That’s so crazy you were born deaf! fwiw I don’t personally care about this slight offset in bm3 or any other music software, in terms of whether or not it should be changed at all. Human timing can make things more interesting, but I also respect and appreciate when timings are super tight. 

    That said, sharing a project file and/or videos demonstrating an issue like this will help if there’s actually anything that can be fixed. I haven’t personally experienced it as a problem, but I also haven’t been able to spend much time making music the past couple months.
  • edited December 2017
    Like I say, the only way to help fix this is to have a secondary independent recording, @ronji.
    It's the only way to have a guide stick: the recording of the event data in Beatmaker, and a straight live audio recording, from a completely separate device, of what it should have recorded.
  • I agree with @5pinlink, but I think you might get 'lucky' by putting a programmed click at the start of a performance for synchronisation purposes. That way you could detect whether there is a consistent delay. Of course, finding no difference does not prove there's nothing wrong, because of the aforementioned arguments; it only proves something if you hear a problem.
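The sync-click idea above could be automated rather than done by ear. A hypothetical sketch, assuming both recordings are already loaded as sample arrays at a common rate (the function name, the 44.1 kHz rate, and the synthetic click are all my assumptions; cross-correlation here just finds the lag at which the two signals best align):

```python
# Hypothetical sketch: estimate a consistent delay between two recordings that
# both start with the same click, using cross-correlation.
import numpy as np

SR = 44_100                      # sample rate, assumed common to both recordings

def estimate_delay_samples(reference: np.ndarray, delayed: np.ndarray) -> int:
    """Return the lag (in samples) at which `delayed` best matches `reference`."""
    corr = np.correlate(delayed, reference, mode="full")
    return int(np.argmax(corr)) - (len(reference) - 1)

# Synthetic test: a short click, and the same click shifted by 230 samples.
click = np.zeros(SR // 10)
click[100:110] = 1.0
shifted = np.roll(click, 230)

lag = estimate_delay_samples(click, shifted)
print(lag, f"samples = {1000 * lag / SR:.2f} ms")   # 230 samples ≈ 5.22 ms
```

A consistent lag across several such clicks would suggest plain latency; lags that jump around would point more toward the "ghost quantize" behaviour described earlier in the thread.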