
Evaluation code #67

Open
xianyongqin opened this issue Nov 19, 2020 · 2 comments
xianyongqin commented Nov 19, 2020

Thanks for releasing this amazing code. Is it possible to release the evaluation code that reproduces the Chamfer distance (GenRe, 0.106) reported in Table 1 of the GenRe paper? I followed the setting described in the paper but got a much worse Chamfer distance (GenRe, 0.30) with the released pre-trained model.


yuchenrao commented Apr 21, 2021

Hi @xianyongqin, I also got worse evaluation results using the evaluation code from Pix3D. For the lamp class, the paper reports 0.124 but I got 0.297. Did you figure out why this happens? Thanks a lot!


ztzhang (Collaborator) commented Apr 23, 2021

See issue #73. For the CD discrepancies, several things might help trace the issue:

  1. The ground-truth voxels should be surface-only.
  2. The prediction should at least match the given mask at the test viewing angle.
  3. For the results in Table 1, we searched over different thresholding values, usually from 0.3 to 0.5 with a 0.05 step size, separately for each class. Though this is not ideal, we did this because previous baselines reported their numbers in a similar fashion.
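The per-class threshold search described above can be sketched as follows. This is a minimal NumPy illustration, not the repository's actual evaluation script; the function names, the voxel-center point convention, and the brute-force pairwise-distance computation are my own assumptions. It binarizes a predicted occupancy grid at each candidate threshold, keeps only surface voxels (point 1 above), and reports the threshold with the lowest symmetric Chamfer distance to the ground-truth surface points:

```python
import numpy as np

def chamfer_distance(p1, p2):
    """Symmetric Chamfer distance between point sets of shape (N, 3) and (M, 3)."""
    d = np.linalg.norm(p1[:, None, :] - p2[None, :, :], axis=-1)  # (N, M) pairwise distances
    return d.min(axis=1).mean() + d.min(axis=0).mean()

def surface_points(vox, thresh):
    """Binarize a voxel grid at `thresh` and return the coordinates of surface voxels.

    A voxel counts as surface if it is occupied and at least one of its
    six face neighbors is empty.
    """
    occ = vox >= thresh
    padded = np.pad(occ, 1, constant_values=False)
    interior = np.ones_like(occ)
    for axis in range(3):
        for shift in (-1, 1):
            # shifted view of the padded grid gives the face neighbor along this axis
            interior &= np.roll(padded, shift, axis=axis)[1:-1, 1:-1, 1:-1]
    return np.argwhere(occ & ~interior).astype(float)

def best_threshold_cd(pred_vox, gt_points, thresholds=np.arange(0.3, 0.501, 0.05)):
    """Search thresholds 0.3..0.5 (step 0.05) and return (best_threshold, best_cd)."""
    best_t, best_cd = None, np.inf
    for t in thresholds:
        pts = surface_points(pred_vox, t)
        if len(pts) == 0:
            continue  # skip thresholds that leave no occupied voxels
        cd = chamfer_distance(pts, gt_points)
        if cd < best_cd:
            best_t, best_cd = t, cd
    return best_t, best_cd
```

In a real evaluation the surface points would be normalized to the same scale as the ground truth before computing the distance, and the brute-force pairwise matrix would be replaced by a k-d tree for large point sets.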
