Describe the bug
This is a very minor thing, I guess, but to my understanding the effect of buffering is not shown correctly in the Jamulus main window / not calculated correctly in
CClient::EstimatedOverallDelay. It uses a factor of 0.7 for the local and remote buffers, with the comment that the buffers are usually a bit larger than required. That may be true, and the achievable delay would be lower if the buffers were set correctly.
But I assume that the delay display should show the delay currently experienced, not the delay potentially achievable with better buffer settings.
Or am I missing something?
To Reproduce
There is no other measurement available for the delay actually experienced, so the value shown may simply be off / too low compared to the real experienced delay.
Expected behavior
My understanding of the buffer implementation is that it evens out network delay and jitter and adds a fixed delay, so a buffer size of e.g. 4 adds 4 times the block duration.
So the code should read
const float fTotalJitterBufferDelayMs = fSystemBlockDurationMs * ( GetSockBufNumFrames() + GetServerSockBufNumFrames() );
instead of
const float fTotalJitterBufferDelayMs = fSystemBlockDurationMs * ( GetSockBufNumFrames() + GetServerSockBufNumFrames() ) * 0.7f;
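For illustration, here is a minimal, self-contained sketch (not the actual Jamulus code) comparing the two formulas. It assumes a typical system block of 128 samples at 48 kHz (about 2.67 ms) and jitter buffer sizes of 4 frames on both the client and the server side; the two local variables stand in for GetSockBufNumFrames() and GetServerSockBufNumFrames().

// sketch.cpp - compare the 0.7-weighted and the full jitter buffer delay estimate
#include <cstdio>

int main()
{
    const float fSystemBlockDurationMs = 128.0f / 48000.0f * 1000.0f; // ~2.67 ms per block (assumed)
    const int   iLocalSockBufFrames    = 4;                           // stand-in for GetSockBufNumFrames()
    const int   iServerSockBufFrames   = 4;                           // stand-in for GetServerSockBufNumFrames()

    // current calculation with the 0.7 correction factor
    const float fWithFactor = fSystemBlockDurationMs * ( iLocalSockBufFrames + iServerSockBufFrames ) * 0.7f;

    // proposed calculation: full buffer size times block duration
    const float fFullBuffer = fSystemBlockDurationMs * ( iLocalSockBufFrames + iServerSockBufFrames );

    std::printf ( "with 0.7 factor: %.2f ms\n", fWithFactor ); // ~14.9 ms
    std::printf ( "full buffers:    %.2f ms\n", fFullBuffer ); // ~21.3 ms
    return 0;
}

With these assumed values the displayed jitter buffer contribution would be roughly 6.4 ms lower than the fixed delay the buffers actually add.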
Operating system
any
Version of Jamulus
3.9.1
Additional context
I am prototyping a statistics console on connection quality that should help to monitor the long-term quality of connections to the server. So I am reading a lot of Jamulus source code and trying to figure out the statistics calculations currently used. This is when I encountered this calculation that I do not understand.