Latency between playback and record based on VST Performer buffer size?

I’ve been scratching my head over this since yesterday.
We have an internal test procedure where the Performer “simply” bounces incoming buffered and timestamped audio unmodified. That test succeeds with sample accuracy.
Audio devices do report their latency, so we adjust the timestamps accordingly: it takes output-latency samples until the signal arrives at the speaker/headphones, you perform to that, and it then takes input-latency samples before the performed signal is available again. There was a bug in that, and we fixed it, yet I still get strange results.
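To make the timestamp adjustment concrete, here is a minimal sketch of the arithmetic described above. The function name and parameters are illustrative assumptions, not the actual VST Connect code: the total round trip to subtract is the reported output latency plus the reported input latency.

```python
def align_record_timestamp(record_ts: int, output_latency: int, input_latency: int) -> int:
    """Map a recorded sample's timestamp back onto the playback timeline.

    The signal leaves the speaker output_latency samples after it is
    written, the performer plays along with what they hear, and the
    recorded input needs input_latency samples before it is available,
    so the sum of both must be subtracted from the record timestamp.
    (Illustrative sketch only, not the real implementation.)
    """
    return record_ts - (output_latency + input_latency)

# Example: a device reporting 256 samples each way shifts a recorded
# timestamp of 10_000 back to 9_488 on the playback timeline.
print(align_record_timestamp(10_000, 256, 256))  # 9488
```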
A/D and D/A converter latencies may indeed cause a shift of a few ms if the audio driver doesn’t account for them. However, that would be a constant offset that would not change with buffer size.
Your numbers indicate that this somehow doesn’t work. If you subtract two times the buffer size (which is usually what the combined I/O latency amounts to) in each case, you end up with a pretty consistent shift of around 224 samples. This makes me wonder whether you are indeed using version 4.0.44; could you please double-check?
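The check above can be sketched as follows. The measured offsets here are made-up example values chosen only to illustrate the pattern (a residual that stays constant across buffer sizes), not the actual numbers from this thread:

```python
# Hypothetical measurements: buffer size -> measured playback/record
# offset in samples. These values are invented for illustration.
measurements = {128: 480, 256: 736, 512: 1248}

for buffer_size, measured_offset in measurements.items():
    # I/O latency is usually about twice the buffer size (once for
    # output, once for input), so subtract that from each measurement.
    residual = measured_offset - 2 * buffer_size
    print(f"buffer {buffer_size:>4}: residual shift {residual} samples")
```

If the residual is roughly the same at every buffer size (here a constant 224 samples), the remaining shift is independent of buffering and points at a fixed offset somewhere, such as unreported converter latency or a version-specific bug.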
That said, I can somehow reproduce it myself, so I need to check other audio interfaces. Hold the line, we’ll figure it out :slight_smile: