
Add Ercan et al., CVPRW 2023 #218

Merged: 1 commit, Jun 8, 2023
README.md (7 changes: 5 additions & 2 deletions)
@@ -273,7 +273,9 @@ Mach. Intell. Res., 19:412-424, 2022.
- <a name="Gehrig22arxiv"></a>Gehrig, D., Scaramuzza, D.,
*[Are High-Resolution Cameras Really Needed?](https://arxiv.org/abs/2203.14672)*,
arXiv, 2022. [YouTube](https://youtu.be/HV9_FhS-f88), [code](https://uzh-rpg.github.io/eres/).

- <a name="Ercan23cvprw"></a>Ercan, B., Eker, O., Erdem, A., Erdem, E.
*[EVREAL: Towards a Comprehensive Benchmark and Analysis Suite for Event-based Video Reconstruction](https://openaccess.thecvf.com/content/CVPR2023W/EventVision/papers/Ercan_EVREAL_Towards_a_Comprehensive_Benchmark_and_Analysis_Suite_for_Event-Based_CVPRW_2023_paper.pdf)*,
IEEE Conf. Computer Vision and Pattern Recognition Workshops (CVPRW), 2023. [PDF](https://openaccess.thecvf.com/content/CVPR2023W/EventVision/papers/Ercan_EVREAL_Towards_a_Comprehensive_Benchmark_and_Analysis_Suite_for_Event-Based_CVPRW_2023_paper.pdf), [Project Page](https://ercanburak.github.io/evreal.html), [Suppl.](https://openaccess.thecvf.com/content/CVPR2023W/EventVision/supplemental/Ercan_EVREAL_Towards_a_CVPRW_2023_supplemental.zip), [Code](https://github.com/ercanburak/EVREAL).
Collaborator:
Thanks. BTW, [33] has open-source code :) https://github.com/tub-rip/event_based_image_rec_inverse_problem, which is omitted in Table 1.

Contributor (author):

Ah, thank you very much for bringing that comment here :) Let me respond and hopefully clarify that now:

In Table 1, the "open source" column indicates whether a work's "evaluation" code/setup is open, i.e., whether one can run a script and regenerate the quantitative results reported in that work, similar to this and that. I think being able to reproduce quantitative results is important, since it is not feasible to confirm that the code worked as expected just by looking at qualitative results. I believe the years-old issues on the code repositories of some earlier works (examples: 1, 2, 3) indicate that this is indeed something the community needs.

Given that clarification, please let us know if we are still missing something in Table 1, particularly related to [33] :)
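
To make the "run a script and regenerate the numbers" idea concrete, here is a minimal, hypothetical sketch (an editor's illustration, not code from EVREAL or any of the referenced repositories): it pairs reconstructed frames with ground-truth frames by file name and averages standard image-quality metrics over a sequence. The directory layout, file names, and the choice of MSE/SSIM are assumptions for illustration only.

```python
# Hypothetical reproducibility sketch: score reconstructed frames against
# ground-truth frames and report averaged metrics. Paths, file naming, and
# metric choices are illustrative assumptions, not EVREAL's actual interface.
from pathlib import Path

import numpy as np
import imageio.v2 as imageio
from skimage.metrics import mean_squared_error, structural_similarity


def evaluate_sequence(recon_dir: str, gt_dir: str) -> dict:
    """Average MSE/SSIM over grayscale frame pairs matched by file name."""
    mses, ssims = [], []
    for gt_path in sorted(Path(gt_dir).glob("*.png")):
        recon_path = Path(recon_dir) / gt_path.name
        gt = imageio.imread(gt_path).astype(np.float32) / 255.0
        recon = imageio.imread(recon_path).astype(np.float32) / 255.0
        mses.append(mean_squared_error(gt, recon))
        ssims.append(structural_similarity(gt, recon, data_range=1.0))
    return {"mse": float(np.mean(mses)), "ssim": float(np.mean(ssims))}


if __name__ == "__main__":
    # Example invocation with assumed paths.
    scores = evaluate_sequence("results/e2vid/seq01", "datasets/seq01/gt")
    print(f"MSE: {scores['mse']:.4f}  SSIM: {scores['ssim']:.4f}")
```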

Collaborator:

Thanks for the clarification. I think it is a matter of expectations: when I read "Open Source" in the column header, with one method/reference per row, I assumed it listed which references had code available and which did not. For example, the first row, [27], has no code available. Maybe a header like "Eval. code available" would have been clearer. No worries.

Contributor (author):

Thank you very much for your comments. You are right, it could have been explained better (if we weren't trying so hard to fit within the page limit).

Looking forward to the workshop!


<a name="algorithms"></a>
# Algorithms
@@ -2665,6 +2667,7 @@ Arxiv, 2023.
- [DVS Reconstruction code](https://github.com/VLOGroup/dvs-reconstruction) associated to the paper [Reinbacher et al., BMVC 2016](#Reinbacher16bmvc).
- [High-pass filter code](https://github.com/cedric-scheerlinck/dvs_image_reconstruction) associated to the paper [Scheerlinck et al., ACCV 2018](#Scheerlinck18accv)
- [E2VID code](https://github.com/uzh-rpg/rpg_e2vid) associated to the paper [Rebecq et al., TPAMI 2020](#Rebecq20tpami).
- [EVREAL code](https://github.com/ercanburak/EVREAL) associated to the paper [Ercan et al., CVPRW 2023](#Ercan23cvprw).
- **Localization and Ego-Motion Estimation**
- [Panoramic tracking code](https://github.com/VLOGroup/dvs-panotracking) associated to the paper [Reinbacher et al., ICCP 2017](#Reinbacher17iccp).
- **Pattern Recognition**
@@ -2977,4 +2980,4 @@ MSc. Thesis, Université Laval, Canada, 2022.
<a name="contributing"></a>
# Contributing
Please see [CONTRIBUTING](https://github.com/uzh-rpg/event-based_vision_resources/blob/master/Contributing.md) for details.
***
***