SelfConsistentHubbardWorkChain: Relabeling of HubbardStructure if relaxations are skipped #63

Merged · 3 commits · Apr 11, 2024

Conversation

t-reents (Collaborator) commented:

Fixes #61.

@bastonero I moved the relabeling to a separate method, as discussed. I decided to introduce a new method in SelfConsistentHubbardWorkChain, instead of a new utility function in a separate module, so that it can directly access the context of the WorkChain. The relabeling logic is the same as before, and the method also takes care of generating an appropriate message to be passed to self.report.
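
A minimal sketch of what such a work-chain method could look like, purely illustrative and not the merged implementation; the helpers `needs_new_kinds` and `relabel_kinds` are hypothetical placeholders:

```python
# Illustrative sketch only: a relabeling step defined on the work chain itself,
# so it can read and update `self.ctx` directly and report what happened.
# `needs_new_kinds` and `relabel_kinds` are hypothetical placeholders, not
# actual functions of aiida-quantumespresso-hp.
def relabel_hubbard_structure(self, workchain):
    """Relabel the Hubbard structure returned by ``workchain`` if new kinds are required."""
    hubbard_structure = workchain.outputs.hubbard_structure

    if needs_new_kinds(hubbard_structure):
        self.ctx.current_hubbard_structure = relabel_kinds(hubbard_structure)
        self.report('new Hubbard sites detected: relabeled the Hubbard structure')
    else:
        self.ctx.current_hubbard_structure = hubbard_structure
        self.report('no relabeling of the Hubbard structure needed')
```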

In this new version, the HubbardStructure would also be relabeled if meta_convergence == False. If you would like to stick to the current logic, in the sense that it's not updated in that case, I could simply add an if-statement to the method. This is up to you.

The name of the method is of course debatable as well. I wasn't 100% happy with the current name, as the method also checks whether an update is necessary and might not relabel the structure at all. However, something like should_relabel... would suggest that a bool is returned, and update_hubbard... seemed a bit too general.

Tests have been adjusted as well, and I added another one to check that the correct reporting messages are provided.

The `current_hubbard_structure` is not relabeled in `SelfConsistentHubbardWorkChain` if `skip_relax_iterations != 0`.
This can cause problems in the subsequent `PwRelaxWorkChains` if new types are introduced.
Therefore, the relabeling step is moved to a separate method that is called in all cases.
bastonero (Collaborator) left a comment:

Thanks a lot @t-reents! As general feedback, I would try to have methods that each do one specific thing, and not mix concerns, e.g. running checks multiple times. On the other hand, I recognize that it's a bit hard to devise something smarter, at least with the current design. For instance, should_check_convergence emits report messages but is called twice, so it's not the best solution. I think we can live with that; besides, it's not a given that, when meta_convergence is switched on, a new cycle is immediately restarted. With the new HubbardStructureData, we can actually still compare parameters if they have the same length but different kinds.
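
As a hypothetical illustration of that last point (not the actual HubbardStructureData API), the parameter values can be compared pairwise even when the kind labels differ, as long as both lists have the same length:

```python
# Illustrative only: check convergence of Hubbard parameter values by comparing
# them pairwise, ignoring the (possibly relabeled) kind names.
def hubbard_values_converged(old_values, new_values, tolerance=0.1):
    """Return True if both lists have equal length and agree within ``tolerance`` (eV)."""
    if len(old_values) != len(new_values):
        return False
    return all(abs(old - new) < tolerance for old, new in zip(old_values, new_values))
```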

(Several inline review comments on src/aiida_quantumespresso_hp/workflows/hubbard.py; all outdated and resolved.)
t-reents force-pushed the fix/61_relabeling_when_skip_relax branch from 214f499 to 1bcc183 on February 13, 2024 at 11:01.
t-reents (Collaborator, Author) commented:

Thanks for your comments @bastonero! I agree that this simplified logic makes more sense and that the messages are probably not necessary. I only introduced this logic to keep the previous reports etc., and not to mess things up too much. But since you don't think they are necessary either, I'm more than happy to remove them.

bastonero (Collaborator) commented:

Hi @t-reents, did you use this feature lately? Does everything work? If yes, I guess we could merge this then.

t-reents (Collaborator, Author) commented:

Hi @bastonero, I used it after I implemented it and everything seems to work, so I think it can be merged.

bastonero merged commit 4c8036e into master on Apr 11, 2024 (6 checks passed).
bastonero (Collaborator) commented:

Thanks a lot @t-reents !

bastonero deleted the fix/61_relabeling_when_skip_relax branch on April 11, 2024 at 15:07.
Closes issue: SelfConsistentHubbardWorkChain: Missing relabeling when skipping relaxation