
adjustment for anomaly detection #24

Open
amueller opened this issue Jun 11, 2024 · 2 comments
Comments

@amueller

Hey!
Thanks for making the code available. I was wondering about the adjustment made for anomaly detection. Is this applied to all the competing methods as well? And is there documentation for the datasets that explains that part of the evaluation?

@gasvn
Member

gasvn commented Jun 12, 2024

For anomaly detection, we follow https://github.com/thuml/Time-Series-Library and use the same strategy for all methods. You can refer to their repo for more details about the evaluation.
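For readers following along: the adjustment in question is the "point adjustment" protocol commonly used in that line of work. Below is a minimal sketch of the idea, assuming binary per-timestep labels; the function name and details here are illustrative, not taken from the Time-Series-Library code.

```python
import numpy as np

def point_adjust(gt, pred):
    """If any point inside a ground-truth anomaly segment is flagged,
    count the entire segment as detected (sketch of the point-adjustment
    strategy; illustrative, not the repo's exact implementation)."""
    gt = np.asarray(gt, dtype=bool)
    pred = np.asarray(pred, dtype=bool).copy()
    n = len(gt)
    i = 0
    while i < n:
        if gt[i]:
            # find the end of this contiguous anomaly segment
            j = i
            while j < n and gt[j]:
                j += 1
            # one hit anywhere in [i, j) marks the whole segment detected
            if pred[i:j].any():
                pred[i:j] = True
            i = j
        else:
            i += 1
    return pred
```

For example, with ground truth `[0,1,1,1,0,0,1,1]` and raw predictions `[0,0,1,0,0,0,0,0]`, the first segment is fully credited (one of its points was flagged) while the second stays undetected, giving `[0,1,1,1,0,0,0,0]`.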

@amueller
Author

amueller commented Jul 2, 2024

Thank you. That's interesting; it seems to be common in the deep learning literature but not in the earlier time series anomaly detection literature. For example, the VUS paper (https://www.paparrizos.org/papers/PaparrizosVLDB22b.pdf) and the "precision and recall for time series" paper (https://proceedings.neurips.cc/paper_files/paper/2018/file/8f468c873a32bb0619eaeb2050ba45d1-Paper.pdf) don't consider this version, right?

It makes sense that you follow the competing methods, but is there a reason not to use the published and studied metrics?
