
About train datasets #63

Open
Madeline-hyh opened this issue Jul 17, 2024 · 7 comments

Comments

@Madeline-hyh

Great work; I'm very interested in it. I have a small question: which training dataset was used for the wild-daclip_ViT-L-14.pt model, and do all the training datasets contain the four mixed degradations (blur, noise, resize, jpeg)? Is the "wild" degradation exactly "blur, noise, resize, jpeg"? We look forward to your reply. Thank you.

@Algolzw (Owner)

Algolzw commented Jul 21, 2024

Thank you! We use the LSDIR dataset to train the model. For both daclip_ViT-L-14 and Wild-IR, all training LQ images are generated using the random_degradation function.
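The repo's actual random_degradation implementation may differ; as a rough sketch (with illustrative parameter ranges and Pillow/NumPy helpers, not the repo's code), a mixed-degradation pipeline that applies a random subset of blur, noise, and JPEG compression to each LQ image could look like this:

```python
import io
import random

import numpy as np
from PIL import Image, ImageFilter


def degrade_blur(img: Image.Image) -> Image.Image:
    # Gaussian blur with a randomly chosen radius (range is illustrative).
    radius = random.uniform(0.5, 3.0)
    return img.filter(ImageFilter.GaussianBlur(radius))


def degrade_noise(img: Image.Image) -> Image.Image:
    # Additive Gaussian noise with random sigma, clipped back to uint8.
    arr = np.asarray(img).astype(np.float32)
    sigma = random.uniform(5.0, 25.0)
    arr += np.random.normal(0.0, sigma, arr.shape)
    return Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))


def degrade_jpeg(img: Image.Image) -> Image.Image:
    # Round-trip through an in-memory JPEG at random quality.
    quality = random.randint(30, 80)
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB")


def random_degradation(img: Image.Image, n_degs: int = 2) -> Image.Image:
    # Apply a random subset of the degradations in random order.
    degs = random.sample([degrade_blur, degrade_noise, degrade_jpeg], k=n_degs)
    for deg in degs:
        img = deg(img)
    return img


if __name__ == "__main__":
    hq = Image.fromarray(np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8))
    lq = random_degradation(hq)
    print(lq.size)  # (64, 64): degradations keep spatial size (no downsampling)
```

Each call draws a different random mixture, so every LQ training image sees a different combination of degradations.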

@Madeline-hyh (Author)

Thank you very much for your patient reply. After reading the code file you shared, I noticed that your deg_list does not include resize (it is replaced by blur). Could you explain why? Thank you.

@Algolzw (Owner)

Algolzw commented Jul 22, 2024

Great! I prefer 'blur' since most 'resize' operations use bicubic/bilinear/nearest interpolation.

@Lincoln20030413

What does interpolation affect? Isn't this operation commonly used by earlier image restoration models when building super-resolution datasets? Looking forward to your answer.

@Algolzw (Owner)

Algolzw commented Jul 27, 2024

Interpolation can also be seen as a kind of blur operation. In super-resolution, the usual assumption is blur first, then downsample; here we skip the downsampling step (because this model does not support the super-resolution task).
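A quick way to see this (an illustrative check, not code from the repo): a bicubic downscale followed by an upscale back to the original size removes high frequencies, much like an explicit blur would.

```python
import numpy as np
from PIL import Image

# A high-frequency-rich test image: random grayscale noise.
rng = np.random.default_rng(0)
hq = Image.fromarray(rng.integers(0, 256, (64, 64), dtype=np.uint8))

# "Resize degradation" with no net scale change: 64 -> 32 -> 64 bicubic.
lq = hq.resize((32, 32), Image.BICUBIC).resize((64, 64), Image.BICUBIC)


def hf_energy(img: Image.Image) -> float:
    # Mean absolute horizontal difference: a crude high-frequency proxy.
    a = np.asarray(img, dtype=np.float32)
    return float(np.abs(np.diff(a, axis=1)).mean())


print(hf_energy(hq) > hf_energy(lq))  # True: resize acts as a low-pass filter
```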

@Lincoln20030413

Thanks for the answer. So did you use blur instead of resize because the interpolation inside resize itself introduces some problems?

@Algolzw (Owner)

Algolzw commented Aug 7, 2024

I'm not sure either, but sometimes resize should also work; it generally depends on the task.
