diff --git a/.github/workflows/unit_test.yml b/.github/workflows/unit_test.yml index d7ac03cf..cd7e5ffc 100644 --- a/.github/workflows/unit_test.yml +++ b/.github/workflows/unit_test.yml @@ -122,10 +122,10 @@ jobs: needs: test-linux steps: - - uses: actions/checkout@v3 + - uses: actions/checkout@v4 - uses: conda-incubator/setup-miniconda@v3 with: - python-version: "3.10" + python-version: "3.11" mamba-version: "*" channels: conda-forge,defaults channel-priority: true @@ -140,9 +140,11 @@ jobs: python -m sparc.download_data - name: Download SPARC output files to SPARC-master run: | - # TODO: merge to master - wget -O SPARC-socket.zip https://codeload.github.com/alchem0x2A/SPARC/zip/refs/heads/socket - unzip SPARC-socket.zip + # Use latest SPARC public code with socket support + HASH=99e4b7e94ca6f7b4ca1dde9135bea4075b0678f4 + wget -O SPARC-socket.zip https://codeload.github.com/SPARC-X/SPARC/zip/$HASH + unzip SPARC-socket.zip && rm -rf SPARC-socket.zip + mv SPARC-$HASH SPARC-socket - name: Compile SPARC with socket run: | cd SPARC-socket/src diff --git a/README.md b/README.md index 9d146bf9..bf94d06a 100644 --- a/README.md +++ b/README.md @@ -60,6 +60,22 @@ Install the pre-compiled SPARC binary alongside SPARC-X-API (Linux only). 
conda install -c conda-forge sparc-x ``` +### Setup SPARC-X-API + +Preferences for SPARC-X-API and the SPARC C/C++ code can be defined in the ASE [configuration file](https://wiki.fysik.dtu.dk/ase/ase/calculators/calculators.html#calculator-configuration), located at `~/.config/ase/config.ini`, as in the following example: + +```ini +[sparc] +; `command`: full shell command (including MPI directives) to run SPARC +command = srun -n 24 path/to/sparc + +; `psp_path`: directory containing pseudopotential files (optional) +psp_path = path/to/SPARC/psps + +; `doc_path`: directory for SPARC LaTeX documentation to build the JSON schema on the fly (optional) +doc_path = path/to/SPARC/doc/.LaTeX/ +``` + ### Reading / Writing SPARC files SPARC-X-API provides a file format `sparc` compatible with the ASE @@ -68,12 +84,14 @@ in-/output files as a bundle: - Read from a SPARC bundle ```python +# `format="sparc"` should be specified from ase.io import read atoms = read("sparc_calc_dir/", format="sparc") ``` - Write input files ```python +# `format="sparc"` should be specified from ase.build import Bulk atoms = Bulk("Al") * [4, 4, 4] atoms.write("sparc_calc_dir/", format="sparc") diff --git a/doc/advanced_socket.md b/doc/advanced_socket.md index 0e6eb227..02ac0114 100644 --- a/doc/advanced_socket.md +++ b/doc/advanced_socket.md @@ -21,10 +21,14 @@ Fig. 1. SPARC electronic calculations with socket communication across hybrid co ``` **Requirements**: the SPARC binary must be manually compiled from the source -code with [socket -support](https://github.com/alchem0x2A/SPARC/tree/socket) and with the +code with socket +support and with the `USE_SOCKET=1` flag enabled (see the [installation -instructions](https://github.com/alchem0x2A/SPARC/tree/socket). +instructions](https://github.com/SPARC-X/SPARC?tab=readme-ov-file#2-installation)). + +```{note} +You need a SPARC C/C++ release after version 2024.11.18 to enable socket support.
+``` ## Usage The socket communication layer in SPARC and SPARC-X-API are designed for: @@ -38,11 +42,11 @@ standard. Specifically, we implement the original i-PI protocol within the SPARC C-source code, while the python SPARC-X-API uses a backward-compatible protocol based on i-PI. The dual-mode design is aimed for both low-level and high-level interfacing of the DFT codes, -providing the following features as shown in [Fig. 2](#SPARC-protocol-overview): +providing the following features as shown in [Fig. 2](#scheme-sparc-protocol): -(scheme-sparc-protocol)= ```{figure} img/scheme_sparc_protocol.png :alt: scheme-sparc-protocol +:name: scheme-sparc-protocol Fig. 2. Overview of the SPARC protocol as an extension to the standard i-PI protocol. ``` @@ -51,8 +55,8 @@ Based on the scenarios, the socket communication layer can be accessed via the following approaches as shown in [Fig. 3](#scheme-sparc-modes): -(scheme-sparc-modes)= ```{figure} img/scheme-SPARC-socket-modes.png +:name: scheme-sparc-modes :alt: scheme-sparc-modes Fig. 3. Different ways of using SPARC's socket mode. @@ -81,7 +85,7 @@ Fig. 3. Different ways of using SPARC's socket mode. to be run on a single computer system. -2. **Local-only Mode** ([Fig. 3](#scheme-sparc-modes) **b**) +2. **Local-only mode** ([Fig. 3](#scheme-sparc-modes) **b**) Ideal for standalone calculations, this mode simulates a conventional calculator while benefiting from socket-based efficiency. @@ -91,7 +95,7 @@ Fig. 3. Different ways of using SPARC's socket mode. ``` For most users we recommend using this mode when performing a calculation on a single HPC node. -3. **Client (Relay) Mode** ([Fig. 3](#scheme-sparc-modes) **c**) +3. **Client (Relay) mode** ([Fig. 3](#scheme-sparc-modes) **c**) In this mode, the `sparc.SPARC` calculator servers as a passive client which listens to a remote i-PI-compatible server. When @@ -119,7 +123,7 @@ Fig. 3. Different ways of using SPARC's socket mode. 
automatically determine if it is necessary to restart the SPARC subprocess. -4. **Server Mode** ([Fig. 3](#scheme-sparc-modes) **d**) +4. **Server mode** ([Fig. 3](#scheme-sparc-modes) **d**) Paired with the client mode in (3), SPARC-X-API can be run as a socket server, isolated from the node that performs the diff --git a/doc/advanced_topics.md b/doc/advanced_topics.md index 4e72edfd..1d778468 100644 --- a/doc/advanced_topics.md +++ b/doc/advanced_topics.md @@ -93,7 +93,7 @@ old_atoms.get_potential_energy() ``` - +(rule-input-param)= ## Rules for input parameters in `sparc.SPARC` calculator When constructing the `sparc.SPARC` calculator using the syntax diff --git a/doc/basic_usage.md b/doc/basic_usage.md index 39ff409f..6b560234 100644 --- a/doc/basic_usage.md +++ b/doc/basic_usage.md @@ -13,7 +13,7 @@ file format: ```python import sparc from ase.io import read, write -atoms = read("test.sparc", index=-1) +atoms = read("test.sparc", index=-1, format="sparc") ``` *Note*: To read multiple output files from the same directory, e.g., SPARC.aimd, SPARC.aimd\_01, pass the keyword argument `include_all_files=True` to `read()` @@ -24,7 +24,11 @@ import sparc from ase.io import read, write from ase.build import Bulk atoms = Bulk("Al") * [4, 4, 4] -atoms.write("test.sparc") +atoms.write("test.sparc", format="sparc") +``` + +```{note} +You need to specify `format="sparc"` when using the `read` and `write` functions from `ase.io`, as automatic file extension detection doesn't work for directories. ``` For a deeper dive into the bundle I/O format, see [Advanced Topics](advanced_topics.md). @@ -109,7 +113,7 @@ If you want to extract more information about the MD simulation steps, take a lo 4. Geometric optimization using ASE's optimizers -The power of `SPARC-X-API` is to combine single point `SPARC` calculations with advanced ASE optimizers, such as BFGS, FIRE or GPMin. 
Example 2 can be re-written as: +The power of `SPARC-X-API` is to combine single point `SPARC` calculations with advanced ASE optimizers, such as `BFGS`, `FIRE` or `GPMin`. Example 2 can be re-written as: ```python from sparc.calculator import SPARC @@ -133,11 +137,14 @@ for the visualization of atomistic structures. Depending on the bundle's contents, this could display individual atoms or multiple images. -[Fig. 2](#fig-2-a-screenshot-of-the-sparc-ase-program) is a screenshot showing the usage of `sparc-ase gui` to visualize a -short [MD trajectory](tests/outputs/NH3_sort_lbfgs_opt.sparc). +[Fig. 1](#fig-screenshot-sparc-ase) is a screenshot showing the usage of `sparc-ase gui` to visualize a +short MD trajectory. -#### Fig 2. A screenshot of the `sparc-ase` program -image +(fig-screenshot-sparc-ase)= +```{figure} https://github.com/alchem0x2A/SPARC-X-API/assets/6829706/e72329ff-7194-4819-94f8-486ef2218844 + +Fig 1. A screenshot of the `sparc-ase` program +``` ### Parameters and units used in `SPARC-X-API` @@ -149,7 +156,7 @@ atoms.calc = SPARC(h=0.25, REFERENCE_CUTOFF=0.5, EXX_RANGE_PBE=0.16, **params) ``` inputs following ASE's convention (e.g., `h`) adopt eV/Angstrom units (thus the same setting can be applied to other DFT calculators), On the other hand, all SPARC-specific parameters, which can often be recognized by their capitalized format (like `REFERENCE_CUTOFF`, `EXX_RANGE_PBE`), retain their original values consistent with their representation in the `.inpt` files. -The reasoning and details about unit conversion can be found in the [Rules for Input Parameters](https://github.com/alchem0x2A/SPARC-X-API/blob/master/doc/advanced_topics.md#rules-for-input-parameters-in-sparcsparc-calculator) in Advanced Topics. +The reasoning and details about unit conversion can be found in the [Rules for Input Parameters](#rule-input-param) in Advanced Topics. 
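The unit convention above can be illustrated with a small standalone sketch. This is an assumed illustration, not SPARC-X-API internals: ASE-style inputs such as `h` are given in Angstrom, while SPARC-native parameters in the `.inpt` file use atomic units (Bohr), so a conversion like the following happens behind the scenes:

```python
# Illustrative sketch only (assumed conversion, not SPARC-X-API code):
# ASE-convention grid spacing `h` is in Angstrom, while SPARC's native
# .inpt values are in atomic units (Bohr).
BOHR_IN_ANGSTROM = 0.529177210903  # 1 Bohr expressed in Angstrom

def h_to_bohr(h_angstrom):
    """Convert an ASE-style grid spacing (Angstrom) to Bohr."""
    return h_angstrom / BOHR_IN_ANGSTROM

print(round(h_to_bohr(0.25), 4))  # -> 0.4724
```

SPARC-specific capitalized parameters (e.g. `REFERENCE_CUTOFF`) skip this conversion entirely and are written to the `.inpt` file as given.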
In order for `SPARC-X-API` to be compatible with other ASE-based DFT calculators, diff --git a/doc/contribute.md b/doc/contribute.md index a77ece43..76230940 100644 --- a/doc/contribute.md +++ b/doc/contribute.md @@ -51,7 +51,7 @@ for pre-commit hooks used in this project, and change them if needed. If you need to test running DFT using the API, compile or install the `sparc` executables following the [manual](https://github.com/SPARC-X/SPARC/blob/master/README.md). Check -[some examples](#install-the-sparc-binary-code) for our recommended +[some examples](#install-binary) for our recommended approaches. @@ -109,7 +109,7 @@ flavor](https://myst-parser.readthedocs.io/en/latest/) of Markdown. The source `.md` files, together with the main `README.md` are then rendered to html files using `sphinx`. -After [setting up the test environment](#setting-up-environment), +After [setting up the test environment](setup_environment.md), additional packages for doc rendering can be installed via: ```bash cd SPARC-X-API @@ -145,55 +145,7 @@ computating power (e.g. a few minutes with 4 CPU cores). Please check the [maintenance guide](maintainers.md) for roles involving admin privileges. - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +```{toctree} +:maxdepth: 1 +test_coverage.md +``` diff --git a/doc/introduction.md b/doc/introduction.md deleted file mode 100644 index e10b99d0..00000000 --- a/doc/introduction.md +++ /dev/null @@ -1 +0,0 @@ -# Introduction diff --git a/doc/setup_environment.md b/doc/setup_environment.md index de98027e..c176ead7 100644 --- a/doc/setup_environment.md +++ b/doc/setup_environment.md @@ -18,10 +18,57 @@ configurations. The default configurations are: ## Custom configurations -**TODO** use sparc ini config. +You can configure the setup for SPARC-X-API using either of the +following methods: +1.
(Recommended) use the [ASE configuration file](https://wiki.fysik.dtu.dk/ase/ase/calculators/calculators.html#calculator-configuration) +2. Use environment variables. Please note that although environment + variables have long been the standard approach to configuring + ASE calculators, they may become obsolete in future releases of ASE. -Please check the following steps for detailed setup: +```{note} +The environment variables take **higher priority** than the equivalent fields in the configuration file, if both are set. +``` + +### Editing the configuration file + +ASE will look for a configuration file at `~/.config/ase/config.ini` +for package-specific settings. The configuration file follows the [INI +format](https://en.wikipedia.org/wiki/INI_file), where key-value pairs +are grouped in sections. An example of the SPARC-specific section may look as follows: + +```{code} ini +[sparc] +; `command`: full shell command (including MPI directives) to run a SPARC calculation +; has the same effect as `ASE_SPARC_COMMAND` +command = srun -n 24 ~/bin/sparc + +; `psp_path`: directory containing pseudopotential files +; has the same effect as `SPARC_PSP_PATH` +psp_path = ~/dev_SPARC/psps +; `doc_path`: directory for SPARC LaTeX documentation +; has the same effect as `SPARC_DOC_PATH` +doc_path = ~/dev_SPARC/doc/.LaTeX/ +``` + +The available options in the configuration file are: +1. SPARC command: either use `command` to set a full shell string to + run the SPARC program, or use the combination of `sparc_exe` and + `mpi_prefix`. See [SPARC command configuration](#sparc-cmd-setting) + for more details. +2. JSON schema settings (*Optional*): use either `json_schema` to + define a custom JSON schema file, or `doc_path` for parsing the + LaTeX documentation on-the-fly. See [JSON schema + configuration](#json-schema-setting) for more details. +3. Pseudopotential settings (*Optional*): use `psp_path` for the + location of pseudopotential files.
See [pseudopotential files settings](#pseudopot-setting) + for more details. + + +You can override the location of the configuration file via the +environment variable `ASE_CONFIG_PATH`. + +(json-schema-setting)= ### JSON schema Each version of SPARC-X-API ships with a JSON schema compatible with a @@ -31,19 +78,23 @@ dated-version of SPARC C/C++ code. You can find that version in the README badge If that does not match your local SPARC version, you can configure the location of JSON schema using one of the following methods: -1. Set the `$SPARC_DOC_PATH` variable +#### Option 1. Parse LaTeX documentation on-the-fly -`$SPARC_DOC_PATH` will direct SPARC-X-API to look for a local directory containing LaTeX documentation, for example: +The environment variable `SPARC_DOC_PATH` (equivalent to the `doc_path` field in the configuration file) will direct SPARC-X-API to look for a local directory containing LaTeX documentation to parse on-the-fly, for example: ```bash export SPARC_DOC_PATH=/doc/.LaTeX ``` -and parse the JSON schema on-the-fly. +or the equivalent configuration file setting: +```{code} ini +doc_path: /doc/.LaTeX +``` -2. Use your own `parameters.json` + +#### Option 2. Use your own `parameters.json` In some cases an experimental feature may not have been updated in the -official doc. You can create and edit your own `parameters.json` file to -temporarily test a local version of SPARC: +official doc. You can create and edit your own `parameters.json` file +to temporarily test a local version of SPARC: First parse the LaTeX files into `parameters.json` ```bash @@ -72,66 +123,31 @@ Then add / edit missing parameters in the `parameters` section in }, ``` -**TODO** make sure the json parameters are included. -Finally, set up **TODO**.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - +Finally, set the `json_schema` field in the configuration file to the +newly generated JSON file, for example: +```{code} ini +[sparc] +; `json_schema`: custom schema file parsed from LaTeX documentation +json_schema: ~/SPARC/parameters.json +``` +```{warning} +The `json_schema` and `doc_path` fields cannot both be present in the configuration file! +``` +(pseudopot-setting)= ### Pseudopotential files -Pseudopotential files (in `Abinit` [psp8 format](https://docs.abinit.org/developers/psp8_info/) are loaded in the -following order: - - - - - - -To specify a custom path for your pseudopotential files (in Abinit [psp8 format]()), -use the environment variable `$SPARC_PSP_PATH` or `$SPARC_PP_PATH` variable: +To specify a custom path for your pseudopotential files (in Abinit [psp8 format](https://docs.abinit.org/developers/psp8_info/)), +you can either use the environment variable `SPARC_PSP_PATH`: ```bash export SPARC_PSP_PATH=path/to/your/psp8/directory ``` - -**TODO** we should sunset the `SPARC_PP_PATH` +or the equivalent keyword `psp_path` in the configuration file: +```{code} ini +psp_path: path/to/your/psp8/directory +``` When installing SPARC [via `conda-forge`](#use-conda), `$SPARC_PSP_PATH` is already included in the activate script of the @@ -143,27 +159,38 @@ To determine the location of default psp8 files (as in [manual pip installation] python -c "from sparc.common import psp_dir; print(psp_dir)" ``` - -### SPARC Command Configuration +(sparc-cmd-setting)= +### SPARC command configuration The command to execute SPARC calculations is determined based on the following priority: 1. The command argument provided directly to the `sparc.SPARC` calculator. -2. The environment variable `$ASE_SPARC_COMMAND` +2. The command defined by the environment variable `ASE_SPARC_COMMAND` or the configuration file 3.
If neither of the above is defined, `SPARC-X-API` looks for the SPARC binary under current `$PATH` and combine with the suitable MPI command prefix. -Example to set `$ASE_SPARC_COMMAND` +#### Use full command + +The variable `ASE_SPARC_COMMAND` (or the `command` field in the configuration file) +contains the *full command* to run a SPARC calculation. Depending on the system, you may choose one of the following: 1. Using `mpirun` (e.g. on a single test machine) ```bash export ASE_SPARC_COMMAND="mpirun -n 8 -mca orte_abort_on_non_zero_status 1 /path/to/sparc -name PREFIX" ``` +or in the configuration file +```{code} ini +command: mpirun -n 8 -mca orte_abort_on_non_zero_status 1 /path/to/sparc -name PREFIX +``` 2. Using `srun` (e.g. [SLURM](https://slurm.schedmd.com/documentation.html) job system HPCs) ```bash export ASE_SPARC_COMMAND="srun -n 8 --kill-on-bad-exit /path/to/sparc -name PREFIX" ``` +or in the configuration file +```{code} ini +command: srun -n 8 --kill-on-bad-exit /path/to/sparc -name PREFIX +``` ```{note} 1. The `-name PREFIX` is optional and will automatically replaced by the `sparc.SPARC` calculator. @@ -171,6 +198,10 @@ export ASE_SPARC_COMMAND="srun -n 8 --kill-on-bad-exit /path/to/sparc -name PREF 2. We recommend adding kill switches for your MPI commands like the examples above when running `sparc` to avoid unexpected behaviors with exit signals. ``` +#### Specifying MPI binary location + +It is also possible to construct the SPARC command from two +separate fields in the configuration file: `mpi_prefix` and `sparc_exe`. +When both are set, the final command is equivalent to `{mpi_prefix} {sparc_exe}`. + ## Post-installation check We recommend the users to run a simple test after installation and @@ -260,7 +291,7 @@ sparc_socket_compatibility: False -------------------------------------------------------------------------------- Please check additional information from: 1. SPARC's documentation: https://github.com/SPARC-X/SPARC/blob/master/doc/Manual.pdf -2. Python API documentation: https://github.com/alchem0x2A/SPARC-X-API/blob/master/README.md +2.
Python API documentation: https://github.com/SPARC-X/SPARC-X-API/blob/master/README.md diff --git a/sparc/calculator.py b/sparc/calculator.py index 258fdc8d..29f89324 100644 --- a/sparc/calculator.py +++ b/sparc/calculator.py @@ -10,6 +10,10 @@ import psutil from ase.atoms import Atoms from ase.calculators.calculator import Calculator, FileIOCalculator, all_changes + +# 2024-11-28: @alchem0x2a add support for ase.config +# For now we only use cfg as a parser for configurations +from ase.config import cfg as _cfg from ase.parallel import world from ase.stress import full_3x3_to_voigt_6_stress from ase.units import Bohr, GPa, Hartree, eV @@ -121,7 +125,10 @@ def __init__( socket_params (dict): Parameters to control the socket behavior. Please check default_socket_params **kwargs: Additional keyword arguments to set up the calculator. """ - self.validator = locate_api(json_file=sparc_json_file, doc_path=sparc_doc_path) + # 2024-11-28 @alchem0x2a added cfg as the default validator + self.validator = locate_api( + json_file=sparc_json_file, doc_path=sparc_doc_path, cfg=self.cfg + ) self.valid_params = {} self.special_params = {} self.inpt_state = {} # Store the inpt file states @@ -140,6 +147,7 @@ def __init__( if label is None: label = "SPARC" if restart is None else None + # Use psp dir from user input or env self.sparc_bundle = SparcBundle( directory=Path(self.directory), mode="w", @@ -147,6 +155,7 @@ def __init__( label=label, # The order is tricky here. Use label not self.label psp_dir=psp_dir, validator=self.validator, + cfg=self.cfg, ) # Try restarting from an old calculation and set results @@ -444,30 +453,64 @@ def _make_command(self, extras=""): 2024.09.05 @alchem0x2a Note in ase>=3.23 the FileIOCalculator.command will fallback to self._legacy_default_command, which we should set to invalid value for now.
+ + 2024.11.28 @alchem0x2a + Make use of the ase.config to set up the command """ if isinstance(extras, (list, tuple)): extras = " ".join(extras) else: extras = extras.strip() - if (self.command is None) or (self.command == SPARC._legacy_default_command): - command_env = os.environ.get("ASE_SPARC_COMMAND", None) - if command_env is None: - sparc_exe, mpi_exe, num_cores = _find_default_sparc() - if sparc_exe is None: - raise EnvironmentError( - "Cannot find your sparc setup via $ASE_SPARC_COMMAND, SPARC.command, or " - "infer from your $PATH. Please refer to the manual!" - ) - if mpi_exe is not None: - command_env = f"{mpi_exe} -n {num_cores} {sparc_exe}" - else: - command_env = f"{sparc_exe}" - warn( - f"Your sparc command is inferred to be {command_env}, " - "If this is not correct, " - "please manually set $ASE_SPARC_COMMAND or SPARC.command!" + + print(self.command) + + # User-provided command (and properly initialized) should have + # highest priority + if (self.command is not None) and ( + self.command != SPARC._legacy_default_command + ): + return f"{self.command} {extras}" + + parser = self.cfg.parser["sparc"] if "sparc" in self.cfg.parser else {} + # Get sparc command from either env variable or ini + command_env = self.cfg.get("ASE_SPARC_COMMAND", None) or parser.get( + "command", None + ) + + # Get sparc binary and mpi-prefix (alternative) + sparc_exe = parser.get("sparc_exe", None) + mpi_prefix = parser.get("mpi_prefix", None) + if (sparc_exe is None) != (mpi_prefix is None): + raise ValueError( + "Both 'sparc_exe' and 'mpi_prefix' must be specified together, " + "or neither should be set in the configuration." + ) + if command_env and sparc_exe: + raise ValueError( + "Cannot set both sparc_command and sparc_exe in the config ini file!" 
+ ) + + if sparc_exe: + command_env = f"{mpi_prefix} {sparc_exe}" + + # Fallback + if command_env is None: + sparc_exe, mpi_exe, num_cores = _find_default_sparc() + if sparc_exe is None: + raise EnvironmentError( + "Cannot find your sparc setup via $ASE_SPARC_COMMAND, SPARC.command, or " + "infer from your $PATH. Please refer to the manual!" ) - self.command = command_env + if mpi_exe is not None: + command_env = f"{mpi_exe} -n {num_cores} {sparc_exe}" + else: + command_env = str(sparc_exe) + warn( + f"Your sparc command is inferred to be {command_env}, " + "If this is not correct, " + "please manually set $ASE_SPARC_COMMAND or SPARC.command!" + ) + self.command = command_env return f"{self.command} {extras}" def check_input_atoms(self, atoms): @@ -639,7 +682,10 @@ def _calculate_with_socket( ret["forces"][self.resort], self.results["forces"] ).all(), "Force values from socket communication and output file are different! Please contact the developers." except KeyError: - print("Force values cannot be accessed via the results dictionary. They may not be available in the output file. Ensure PRINT_FORCES: 1\nResults:\n",self.results) + print( + "Force values cannot be accessed via the results dictionary. They may not be available in the output file.
Ensure PRINT_FORCES: 1\nResults:\n", + self.results, + ) # For stress information, we make sure that the stress is always present if "stress" not in self.results: virial_from_socket = ret.get("virial", np.zeros(6)) diff --git a/sparc/io.py b/sparc/io.py index c682e468..b7fd11ba 100644 --- a/sparc/io.py +++ b/sparc/io.py @@ -8,6 +8,7 @@ import numpy as np from ase.atoms import Atoms from ase.calculators.singlepoint import SinglePointDFTCalculator +from ase.config import cfg as _cfg # various io formatters from .api import SparcAPI @@ -21,10 +22,10 @@ from .sparc_parsers.out import _read_out from .sparc_parsers.pseudopotential import copy_psp_file, parse_psp8_header from .sparc_parsers.static import _add_cell_info, _read_static -from .utils import deprecated, locate_api, string2index +from .utils import deprecated, locate_api, sanitize_path, string2index # from .sparc_parsers.ion import read_ion, write_ion -defaultAPI = locate_api() +defaultAPI = locate_api(cfg=_cfg) class SparcBundle: @@ -85,6 +86,7 @@ def __init__( label=None, psp_dir=None, validator=defaultAPI, + cfg=_cfg, ): """ Initializes a SparcBundle for accessing SPARC calculation data. @@ -114,6 +116,7 @@ def __init__( self.init_inputs = {} self.psp_data = {} self.raw_results = {} + self.cfg = cfg self.psp_dir = self.__find_psp_dir(psp_dir) # Sorting should be consistent across the whole bundle! 
self.sorting = None @@ -182,9 +185,14 @@ def __find_psp_dir(self, psp_dir=None): return Path(psp_dir) else: for var in self.psp_env: - env_psp_dir = os.environ.get(var, None) + env_psp_dir = self.cfg.get(var, None) if env_psp_dir: return Path(env_psp_dir) + # Use the psp_path field in cfg + parser = self.cfg.parser["sparc"] if "sparc" in self.cfg.parser else {} + psp_dir_ini = parser.get("psp_path", None) + if psp_dir_ini: + return sanitize_path(psp_dir_ini) # At this point, we try to use the psp files bundled with sparc if is_psp_download_complete(default_psp_dir): return default_psp_dir diff --git a/sparc/quicktest.py b/sparc/quicktest.py index 9863d5f8..8b4cdf88 100644 --- a/sparc/quicktest.py +++ b/sparc/quicktest.py @@ -407,7 +407,7 @@ def main(): cprint( "Please check additional information from:\n" "1. SPARC's documentation: https://github.com/SPARC-X/SPARC/blob/master/doc/Manual.pdf \n" - "2.
Python API documentation: https://sparc-x.github.io/SPARC-X-API\n", color=None, ) diff --git a/sparc/utils.py b/sparc/utils.py index c475904f..6384ed55 100644 --- a/sparc/utils.py +++ b/sparc/utils.py @@ -19,6 +19,9 @@ import numpy as np import psutil + +# 2024-11-28 @alchem0x2a add config +from ase.config import cfg as _cfg + from .api import SparcAPI from .docparser import SparcDocParser @@ -149,41 +152,82 @@ def cprint(content, color=None, bold=False, underline=False, **kwargs): return -def locate_api(json_file=None, doc_path=None): - """Find the default api in the following order 1) User-provided json file path 2) User-provided path to the doc 3) If none of the above is provided, try to use SPARC_DOC_PATH 4) Fallback to the as-shipped json api +def sanitize_path(path_string): + """Sanitize a path-like string on UNIX systems + Returns a resolved PosixPath object + + It is recommended to use this sanitize function + before passing any path-like strings from the cfg parser + """ + if isinstance(path_string, str): + path = os.path.expandvars(os.path.expanduser(path_string)) + path = Path(path).resolve() + else: + path = Path(path_string).resolve() + return path + + +def locate_api(json_file=None, doc_path=None, cfg=_cfg): """ - if json_file is not None: - api = SparcAPI(json_file) - return api + Locate the SPARC API setup file with the following priority: + 1) If `json_file` is provided (either from parameter or cfg), use it directly. + 2) If `doc_path` is provided: + a) Function parameter takes precedence. + b) Environment variable SPARC_DOC_PATH comes next. + c) Configuration section [sparc] in the ini file is the last resort. + 3) If both `json_file` and `doc_path` are provided, raise an exception. + 4) Fallback to the default API setup if neither is provided.
+ """ + parser = cfg.parser["sparc"] if "sparc" in cfg.parser else {} + if not json_file: + json_file = parser.get("json_schema") if parser else None + + # The environment variable SPARC_DOC_PATH takes precedence over the ini setting + if not doc_path: + doc_path = cfg.get("SPARC_DOC_PATH") - if doc_path is None: - doc_path = os.environ.get("SPARC_DOC_PATH", None) + if not doc_path: + doc_path = parser.get("doc_path") if parser else None - if (doc_path is not None) and Path(doc_path).is_dir(): + json_file = sanitize_path(json_file) if json_file else None + doc_path = sanitize_path(doc_path) if doc_path else None + + # Step 3: Ensure mutual exclusivity + if json_file and doc_path: + raise ValueError( + "Cannot set both the path of json file and documentation " + "at the same time!" + ) + + if json_file: + if not json_file.is_file(): + raise FileNotFoundError(f"JSON file '{json_file}' does not exist.") + return SparcAPI(json_file) + + if doc_path: + if not doc_path.is_dir(): + raise FileNotFoundError( + f"Documentation path '{doc_path}' does not exist or is not a directory." + ) try: with tempfile.TemporaryDirectory() as tmpdir: - tmpdir = Path(tmpdir) - tmpfile = tmpdir / "parameters.json" + tmpfile = Path(tmpdir) / "parameters.json" with open(tmpfile, "w") as fd: fd.write( SparcDocParser.json_from_directory( - Path(doc_path), include_subdirs=True + doc_path, include_subdirs=True ) ) api = SparcAPI(tmpfile) - api.source["path"] = Path(doc_path).resolve().as_posix() - api.source["type"] = "latex" + api.source = {"path": str(doc_path.resolve()), "type": "latex"} return api except Exception as e: - warn(f"Cannot load JSON schema from env {doc_path}, the error is {e}.") - pass + raise RuntimeError( + f"Failed to load API from documentation path '{doc_path}': {e}" + ) - api = SparcAPI() - return api + # Fallback to default API + return SparcAPI() # Utilities taken from vasp_interactive project
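The command-resolution priority this patch implements (environment variable first, then the `[sparc]` section of the INI file) can be sketched standalone. This is an illustration only, not the actual `_make_command` code; the real implementation additionally handles `sparc_exe`/`mpi_prefix` and the `$PATH` fallback:

```python
# Minimal sketch of the priority rule for the SPARC launch command:
# an explicit ASE_SPARC_COMMAND environment variable wins over the
# equivalent `command` field in the [sparc] section of the config file.
import configparser


def resolve_command(environ, ini_text):
    """Return the SPARC launch command, or None if nothing is configured."""
    parser = configparser.ConfigParser()
    parser.read_string(ini_text)
    section = parser["sparc"] if parser.has_section("sparc") else {}
    # Environment variable takes priority over the equivalent INI field
    return environ.get("ASE_SPARC_COMMAND") or section.get("command")


ini = "[sparc]\ncommand = srun -n 24 ~/bin/sparc\n"
print(resolve_command({}, ini))  # -> srun -n 24 ~/bin/sparc
print(resolve_command({"ASE_SPARC_COMMAND": "mpirun -n 2 sparc"}, ini))  # -> mpirun -n 2 sparc
```

The stdlib `configparser` accepts both `key = value` and `key: value` forms, which is why the doc examples above can use either style interchangeably.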