Annotation cost for 1‰ labels on a dataset with dense points #1

Open

Cheater-Hater opened this issue Apr 22, 2021 · 1 comment

@Cheater-Hater
Thanks for your awesome work!
I'd like to ask about the annotation cost of your method.

For example, one of the rooms in S3DIS has over 9,000,000 points.
Does that mean you would annotate 9,000 points for that scene?

Such a point cloud has that many points not because the scene is large, but simply because the points are really dense.

I'd like to know whether your method can still work when a dataset's point cloud is not that dense.


PointCloudYC commented Oct 26, 2021

> For example, one of the rooms in S3DIS has over 9,000,000 points.
> Does that mean you would annotate 9,000 points for that scene?

Yes, that is true. However, you annotate these 9,000 (9,000,000 × 0.001) points on the sub-sampled point cloud obtained through grid down-sampling, not on the raw point cloud.
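Here is a rough NumPy sketch of the counting, just to make the budget concrete. The 0.04 m grid size, the synthetic "room", and the purely random choice of labeled points are illustrative assumptions on my part, not the exact SQN pre-processing:

```python
import numpy as np

# Fake "room": 9M points uniformly spread in a 10 m x 8 m x 3 m box
# (only for counting; real S3DIS rooms are structured scans).
rng = np.random.default_rng(0)
raw = rng.random((9_000_000, 3)) * np.array([10.0, 8.0, 3.0])

def grid_subsample(points, grid_size=0.04):
    """Keep one representative point per occupied voxel (simple grid down-sampling)."""
    voxel = np.floor(points / grid_size).astype(np.int64)
    _, keep = np.unique(voxel, axis=0, return_index=True)
    return points[np.sort(keep)]

sub = grid_subsample(raw, grid_size=0.04)

# 0.1% budget computed from the raw point count (9,000 clicks for this room),
# but the clicks land on the much smaller sub-sampled cloud.
budget = min(int(len(raw) * 0.001), len(sub))
labeled_idx = rng.choice(len(sub), size=budget, replace=False)

print(f"raw: {len(raw):,}  sub-sampled: {len(sub):,}  weak labels: {budget:,}")
```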

You might also refer to our unofficial TensorFlow re-implementation, SQN_tensorflow.

As for the density issue, I think it is dataset-dependent. The SQN paper was benchmarked on 7 datasets with different characteristics, and some of them may not be that dense; e.g., Toronto3D and DALES (acquired through aerial/mobile laser scanners) should be less dense than S3DIS (acquired through terrestrial laser scanners).
