Small fixes to the basic tutorial #50

Merged 2 commits on Nov 23, 2023
40 changes: 27 additions & 13 deletions docs/source/1_computing_hubbard.ipynb
@@ -19,7 +19,7 @@
"* __DFPT calculation__: use the {py:class}`~aiida_quantumespresso_hp.workflow.hp.base.HpBaseWorkChain` to do a self-consistent perturbation calculation to predict the Hubbard parameters.\n",
"\n",
"In this tutorial we will make use of the silicon structure to give you an overall understanding of the usage of the package.\n",
-"If you are interested in more advanced features, please have a look at the [next tutorial](./2_parallel_hubbard.ipynb) or to the [how tos](../howto/index.rst).\n",
+"If you are interested in more advanced features, please have a look at the [next tutorial](./2_parallel_hubbard.ipynb) or to the [how tos](howto).\n",
"\n",
"Let's get started!"
]
@@ -141,10 +141,14 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"As you can see, the desired interactions has been initialized correctly. This is important because ``hp.x`` needs to know which atoms need to be perturbed. As you will see later, the ``hp.x`` will take care of adding the remaining interactions with neighbouring atoms.\n",
+"As you can see, the desired interactions have been initialized correctly. \n",
+"This is important because `hp.x` needs to know which atoms need to be perturbed. \n",
+"As you will see later, `hp.x` will take care of adding the remaining interactions with neighbouring atoms.\n",
"\n",
":::{important}\n",
-"When you will use your own structures, make sure to have your 'Hubbard atoms' first in the list of atoms. This is due to the way the ``hp.x`` routine works internally, requiring those to be first. You can simply do this with the following snippet (IF THE NODE IS YET NOT STORED!):\n",
+"When you use your own structures, make sure to have your 'Hubbard atoms' first in the list of atoms.\n",
+"This is due to the way the `hp.x` routine works internally, requiring those to be first.\n",
+"You can simply do this with the following snippet (IF THE NODE IS YET NOT STORED!):\n",
"\n",
"```python\n",
"from aiida_quantumespresso.utils.hubbard import HubbardUtils\n",
@@ -161,7 +165,7 @@
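The snippet in the cell above is truncated by the diff fold. Purely to illustrate the reordering requirement it addresses, a plain-Python stand-in (hypothetical, not the actual `HubbardUtils` API) could look like this:

```python
# Hypothetical stand-in: reorder a list of (symbol, position) sites so that
# the Hubbard atoms come first, preserving relative order within each group.
# This only illustrates the requirement; it is NOT the aiida-quantumespresso API.

def hubbard_atoms_first(sites, hubbard_symbols):
    """Return the sites with Hubbard atoms moved to the front."""
    hubbard = [site for site in sites if site[0] in hubbard_symbols]
    others = [site for site in sites if site[0] not in hubbard_symbols]
    return hubbard + others

sites = [("O", (0.0, 0.0, 0.5)), ("Co", (0.0, 0.0, 0.0)), ("Li", (0.5, 0.5, 0.5))]
reordered = hubbard_atoms_first(sites, {"Co"})
print([site[0] for site in reordered])  # ['Co', 'O', 'Li']
```

In the tutorial itself this must be done before the node is stored, since stored AiiDA nodes are immutable.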
"## Calculating the SCF ground-state\n",
"\n",
"Now that we have defined the structure, we can calculate its ground-state via an SCF using the `PwBaseWorkChain`.\n",
-"We can fill the inputs of the builder of the PwBaseWorkChain through the `get_builder_from_protocol`."
+"We can fill the inputs of the builder of the `PwBaseWorkChain` through the `get_builder_from_protocol()` method."
]
},
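As a rough sketch of the idea behind `get_builder_from_protocol()`, a named set of defaults merged with user overrides, here is a self-contained toy version; the protocol names and keys below are invented for illustration, and the real method returns a fully populated process builder rather than a dict:

```python
# Toy version of the protocol pattern: named defaults, recursively overridden
# by user-supplied values. Keys and protocol names are invented; the real
# get_builder_from_protocol() pre-fills a builder with such defaults.

PROTOCOLS = {
    "fast": {"kpoints_distance": 0.50, "scf": {"conv_thr": 1e-6}},
    "moderate": {"kpoints_distance": 0.15, "scf": {"conv_thr": 1e-8}},
}

def merge(defaults, overrides):
    """Recursively merge `overrides` into a copy of `defaults`."""
    merged = dict(defaults)
    for key, value in overrides.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge(merged[key], value)
        else:
            merged[key] = value
    return merged

def inputs_from_protocol(protocol="moderate", overrides=None):
    return merge(PROTOCOLS[protocol], overrides or {})

inputs = inputs_from_protocol("fast", {"scf": {"conv_thr": 1e-7}})
print(inputs)  # {'kpoints_distance': 0.5, 'scf': {'conv_thr': 1e-07}}
```

The appeal of this design is that sensible defaults come for free while any individual input remains overridable.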
{
@@ -197,7 +201,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"As you can notice from the results, the workchain (actually, the `PwCalculation`!) has a `remote_folder` output namespace. This is what we need in order to run the `HpBaseWorkChain`. "
+"As you can notice from the results, the workchain (actually, the `PwCalculation`!) has a `remote_folder` output.\n",
+"This is what we need in order to run the `HpBaseWorkChain`. "
]
},
{
@@ -208,7 +213,7 @@
"## DFPT calculation of Hubbard parameters\n",
"\n",
"We can perturb the ground-state previously found to compute the Hubbard parameters.\n",
-"Here we will need to use the `HpBaseWorkChain`, and link the `parent folder` previously produced."
+"Here we will need to use the `HpBaseWorkChain`, and link the `remote_folder` previously produced via the `parent_scf` input."
]
},
{
@@ -286,12 +291,15 @@
"source": [
"## Final considerations\n",
"\n",
-"We managed to compute the Hubbard parameters __fully__ ___ab initio___! :tada:\n",
-"Although, as you could have noticed, there were some quite few passages to do by end. Moreover, there are the following considerations:\n",
-"\n",
-"1. For larger and more complex structures you will need to perturb many more atoms. Moreover, to get converged results you will need more the one q points. Clieck [here](./2_parallel_hubbard.ipynb). to learn how to parallelize over atoms and q points\n",
-"2. To do a _full_ self-consistent calculation of these parameters, you should _relax_ your structure with the Hubbard parameters from the ``hp.x`` run, repeat the steps of this tutorial, relax _again_, and do this procedure over and over till convergence. Learn the automated way [here](./3_self_consistent.ipynb)!\n",
+"We managed to compute the Hubbard parameters of LiCoO2 __fully__ ___ab initio___! :tada:\n",
+"However, we had to execute quite a few steps manually, which can be tedious and error prone.\n",
+"Moreover, there are the following considerations:\n",
+"\n",
+"1. For larger and more complex structures you will need to perturb many more atoms.\n",
+"   Moreover, to get converged results you will need more than one q point.\n",
+"   Click [here](./2_parallel_hubbard.ipynb) to learn how to parallelize over atoms and q points.\n",
+"2. To do a _full_ self-consistent calculation of these parameters, you should _relax_ your structure with the Hubbard parameters from the `hp.x` run, repeat the steps of this tutorial, relax _again_, and do this procedure over and over till convergence.\n",
+"   Learn the automated way [here](./3_self_consistent.ipynb)!\n",
"\n",
":::{admonition} Learn more and in details\n",
":class: hint\n",
@@ -302,9 +310,15 @@
":::\n",
"\n",
":::{note}\n",
-"We suggest to proceed first with the tutorial for point (1) and then the one for point (2). Nevertheless, tutorial (1) is not strictly necessary for (1).\n",
+"We suggest to proceed first with the tutorial for point (1) and then the one for point (2). \n",
+"Nevertheless, tutorial (1) is not strictly necessary for (2).\n",
":::"
]
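The self-consistent procedure described in point (2), relax, recompute the Hubbard parameters, and repeat until they stop changing, can be sketched as a plain convergence loop. The functions and numbers below are mock stand-ins for the real DFT+U relaxation and `hp.x` steps, chosen only so the loop is runnable:

```python
def self_consistent_hubbard(structure, compute_u, relax, tol=0.01, max_iter=20):
    """Iterate relax -> recompute U until successive values agree within `tol`."""
    u_old = None
    for _ in range(max_iter):
        u_new = compute_u(structure)          # stand-in for the hp.x run
        if u_old is not None and abs(u_new - u_old) < tol:
            return u_new                      # converged
        structure = relax(structure, u_new)   # stand-in for the DFT+U relaxation
        u_old = u_new
    raise RuntimeError("Hubbard U did not converge")

# Mock physics: "structure" is a single number and U drifts toward 5.0 eV.
u_converged = self_consistent_hubbard(
    structure=0.0,
    compute_u=lambda s: 4.0 + 0.2 * s,
    relax=lambda s, u: u,
)
print(u_converged)  # converges close to 5.0
```

The workchain covered in the third tutorial automates exactly this kind of loop, including restarting the DFPT step from each relaxed structure.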
},
{
"cell_type": "markdown",
"metadata": {},
"source": []
}
],
"metadata": {
@@ -323,7 +337,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-"version": "3.8.10"
+"version": "3.10.13"
},
"orig_nbformat": 4,
"vscode": {
2 changes: 1 addition & 1 deletion docs/source/index.md
@@ -155,7 +155,7 @@ To the reference guides

If you use this plugin for your research, please cite the following work:

-> Lorenzo Bastonero, Cristiano Malica, Marnik Bercx, Eric Macke, Iurii Timrov, Nicola Marzari, and Sebastiaan P. Huber, [*Automated self-consistent prediction of extended Hubbard parameters for Li-ion batteries*](), npj Comp. Mat., **?**, ? (2023)
+> Lorenzo Bastonero, Cristiano Malica, Marnik Bercx, Eric Macke, Iurii Timrov, Nicola Marzari, and Sebastiaan P. Huber, [*Automated self-consistent prediction of extended Hubbard parameters for Li-ion batteries*](https://media.giphy.com/media/zyclIRxMwlY40/giphy.gif), npj Comp. Mat., **?**, ? (2023)

> Sebastiaan. P. Huber, Spyros Zoupanos, Martin Uhrin, Leopold Talirz, Leonid Kahle, Rico Häuselmann, Dominik Gresch, Tiziano Müller, Aliaksandr V. Yakutovich, Casper W. Andersen, Francisco F. Ramirez, Carl S. Adorf, Fernando Gargiulo, Snehal Kumbhar, Elsa Passaro, Conrad Johnston, Andrius Merkys, Andrea Cepellotti, Nicolas Mounet, Nicola Marzari, Boris Kozinsky, and Giovanni Pizzi, [*AiiDA 1.0, a scalable computational infrastructure for automated reproducible workflows and data provenance*](https://doi.org/10.1038/s41597-020-00638-4), Scientific Data **7**, 300 (2020)

6 changes: 3 additions & 3 deletions pyproject.toml
@@ -37,15 +37,15 @@ Documentation = 'https://aiida-quantumespresso-hp.readthedocs.io'

[project.optional-dependencies]
docs = [
-'myst-nb~=0.17',
+'myst-nb~=1.0',

'jupytext>=1.11.2,<1.15.0',
'sphinx-togglebutton',
-'sphinx~=5.2',
+'sphinx~=6.2',
'sphinx-copybutton~=0.5.2',
'sphinx-book-theme~=1.0.1',
'sphinx-design~=0.4.1',
'sphinxcontrib-details-directive~=0.1.0',
-'sphinx-autoapi~=2.0.1',
+'sphinx-autoapi~=3.0',
]
pre-commit = [
'pre-commit~=2.17',