op_read_float jumps to random offsets when called with small buf_size #18
Comments
We weren't able to reproduce the issue following these steps. Could you please provide full code which reproduces the issue? You might also try your current setup against git master in case that makes a difference.
I hope I'm not being disruptive by bumping such an old issue, but I think it's worth noting that you have to adjust the pointer into the buffer to account for the number of samples that have already been read before performing step 5; otherwise the samples at the start of the buffer get overwritten. I wonder if the author forgot to do so, especially as their example output sounds like the result of exactly that mistake.
Using opusfile 0.10.0, when calling op_read_float with a small buffer the function suddenly reads data from a totally different part of the stream.
This results in audible glitches and generally wrong output data.
Have a listen to this file from mozilla: detodos.zip (downloaded from here)
And now listen to the same file processed by opusfile: out2.zip
What I did was very simple. I read data in chunks of 5760 samples like so:

1. Compute `remaining = 5760 - numOfSamplesRead`.
2. If `remaining` is 0, return the buffer and end the loop.
3. Read `remaining` samples using `op_read_float`.

It appears that whenever `remaining` is very small (such as 312 samples), the data is read from a completely different part of the file.
This is supported by the fact that it happens at regular intervals and occurs less frequently with larger buffer sizes.
I haven't tried the same with op_read.