
Implement the Largest Triangle Three Buckets (LTTB) Algorithm #2

Open
Lucas-Mc opened this issue Dec 10, 2020 · 3 comments
Labels
enhancement New feature or request

Comments

@Lucas-Mc
Collaborator

An intelligent algorithm to decimate data without losing much information (signal shape / local extrema); this would be very helpful for rendering large datasets, as long as the algorithm's own cost doesn't outweigh the rendering performance gain 😆

https://skemman.is/bitstream/1946/15343/3/SS_MSthesis.pdf
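
For reference, a minimal NumPy sketch of the bucketing and triangle-area selection described in the thesis; the function name `lttb`, the vectorized area computation, and the example signal are my own choices here, not code from this repo:

```python
import numpy as np

def lttb(x, y, n_out):
    """Largest Triangle Three Buckets downsampling (Steinarsson, 2013).

    Keeps the first and last samples, splits the interior into n_out - 2
    buckets, and from each bucket keeps the point forming the largest
    triangle with the previously kept point and the mean of the next bucket.
    Assumes len(x) is much larger than n_out so every bucket is non-empty.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    if n_out >= n or n_out < 3:
        return x, y

    # Bucket boundaries over the interior samples (indices 1 .. n-2).
    bounds = np.linspace(1, n - 1, n_out - 1).astype(int)

    keep = np.empty(n_out, dtype=int)
    keep[0], keep[-1] = 0, n - 1

    prev = 0
    for i in range(n_out - 2):
        # Mean of the *next* bucket (just the last sample for the final bucket).
        a_lo = bounds[i + 1]
        a_hi = bounds[i + 2] if i + 2 < len(bounds) else n
        avg_x, avg_y = x[a_lo:a_hi].mean(), y[a_lo:a_hi].mean()

        # Candidate points in the current bucket.
        lo, hi = bounds[i], bounds[i + 1]
        bx, by = x[lo:hi], y[lo:hi]

        # Twice the triangle area of (previous point, candidate, next-bucket mean);
        # the constant factor does not change which candidate wins.
        area = np.abs((x[prev] - avg_x) * (by - y[prev])
                      - (x[prev] - bx) * (avg_y - y[prev]))
        prev = lo + int(area.argmax())
        keep[i + 1] = prev

    return x[keep], y[keep]

# Example: reduce 200k noisy samples to 2k points for plotting.
x = np.arange(200_000, dtype=float)
y = np.sin(x / 700.0) + 0.05 * np.random.randn(x.size)
xs, ys = lttb(x, y, 2000)
```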

@Lucas-Mc
Collaborator Author

Preliminary results show a significant slowdown when using the LTTB algorithm. Is the improvement in preserved signal quality worth it? Maybe balance this by using it only when attempting to render large arrays, though restricting it to large arrays also means running LTTB where it is slowest.

LTTB: 0.25301408767700195
array[::down_sample]: 0.0007829666137695312
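
For context, timings like the above could be produced with a comparison along these lines; the array size, output length, and hardware behind the numbers above aren't stated in the issue, so the values below are placeholders:

```python
import time
import numpy as np

# Placeholder signal; the array used for the original timings is not specified.
x = np.arange(1_000_000, dtype=float)
y = np.cumsum(np.random.randn(x.size))
n_out = 2000
down_sample = x.size // n_out

t0 = time.perf_counter()
xs, ys = lttb(x, y, n_out)         # lttb() from the sketch above
t1 = time.perf_counter()
y_naive = y[::down_sample]         # plain strided decimation
t2 = time.perf_counter()

print("LTTB:                ", t1 - t0)
print("array[::down_sample]:", t2 - t1)
```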

@Lucas-Mc
Collaborator Author

Maybe I could downsample all the signals beforehand using this approach, then create new WFDB-formatted signals with a much lower total number of samples?
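
If that route is taken, a sketch with the `wfdb` package might look like the following; the record name, output length, and derived sampling rate are assumptions, and note the caveat that LTTB keeps irregularly spaced samples, so writing them back as a fixed-rate WFDB record is only an approximation:

```python
import numpy as np
import wfdb

# Hypothetical input record and output size.
record = wfdb.rdrecord('sample-data/100')      # p_signal shape: (n_samples, n_channels)
sig = record.p_signal
n_out = 5000

# Pick indices with LTTB on the first channel and reuse them for every channel,
# so all channels stay aligned in a single record.
x = np.arange(sig.shape[0], dtype=float)
xs, _ = lttb(x, sig[:, 0], n_out)              # lttb() from the sketch above
idx = xs.astype(int)                           # x was the sample index, so xs are indices

wfdb.wrsamp(
    record_name=record.record_name + '_lttb',
    fs=record.fs * n_out / sig.shape[0],       # nominal reduced sampling rate
    units=record.units,
    sig_name=record.sig_name,
    p_signal=sig[idx, :],
    fmt=['16'] * sig.shape[1],
)
```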

@Lucas-Mc
Collaborator Author

An example of the algorithm in action:
[image attachment: plot showing LTTB downsampling output]

@Lucas-Mc added the enhancement label on May 11, 2021