ENH: Add static type checking #28

Open · wants to merge 21 commits into base: main

Changes from all commits (21 commits)
1071733
ENH: Add static type checking
jhlegarreta Dec 20, 2024
605c8c3
DOC: Fix spelling mistakes detected by `codespell`
jhlegarreta Dec 20, 2024
d4f92d7
ENH: Remove empty `latex_elements` section in documentation config file
jhlegarreta Dec 21, 2024
ff6d76a
ENH: Fix arguments to `unique` in GP error analysis plot script
jhlegarreta Dec 21, 2024
19dbde7
ENH: Convert list to array prior to computing mean and std dev
jhlegarreta Dec 21, 2024
5c5609a
BUG: Use appropriate keyword arg names to instantiate `SphericalKriging`
jhlegarreta Dec 21, 2024
18bd5b5
ENH: Select appropriate element in case `predict` returns a tuple
jhlegarreta Dec 21, 2024
33e9b6d
ENH: Group keyword arguments into a single dictionary
jhlegarreta Dec 21, 2024
7216dd6
ENH: Use `str` instead of `Path` for `_parse_yaml_config` parameter
jhlegarreta Dec 21, 2024
ccaeae3
ENH: Use arrays in NumPy's `percentile` arguments
jhlegarreta Dec 21, 2024
ddfcbad
ENH: List `sigma_sq` in the GP model slots
jhlegarreta Dec 21, 2024
80b71f5
ENH: Add the dimensionality to the `mask` ndarray parameter annotation
jhlegarreta Dec 21, 2024
3c6ce8e
ENH: Fix type hint for `figsize` parameter
jhlegarreta Dec 21, 2024
662f7b5
ENH: Provide appropriate type hints to `reg_target_type`
jhlegarreta Dec 21, 2024
de2b304
ENH: Instantiate `namedtuple` using the correct syntax for `ImageGrid`
jhlegarreta Dec 21, 2024
681016f
ENH: Remove type hint in overridden `diag` methods
jhlegarreta Dec 21, 2024
f772f33
ENH: Import `Bounds` from `scipy.optimize`
jhlegarreta Dec 21, 2024
5831f8f
ENH: Avoid type checking for private function import statement
jhlegarreta Dec 21, 2024
ade884c
ENH: Remove unused `namedtuple` definition in test
jhlegarreta Dec 21, 2024
4b4caab
ENH: Use `ClassVar` for class variable type hinting
jhlegarreta Dec 21, 2024
470c510
ENH: Annotate `optimizer` attribute type in `DiffusionGPR`
jhlegarreta Dec 22, 2024
25 changes: 25 additions & 0 deletions .github/workflows/test.yml
@@ -84,3 +84,28 @@ jobs:
with:
files: cov.xml
token: ${{ secrets.CODECOV_TOKEN }}

checks:
runs-on: 'ubuntu-latest'
continue-on-error: true
strategy:
matrix:
check: ['spellcheck', 'typecheck']

steps:
- uses: actions/checkout@v4
- name: Install the latest version of uv
uses: astral-sh/setup-uv@v4
# Can remove this once there is a traits release that supports 3.13
- name: Set up Python 3.12
uses: actions/setup-python@v5
with:
python-version: 3.12
- name: Install tox
run: uv tool install tox --with=tox-uv
- name: Show tox config
run: tox c
- name: Show tox config (this call)
run: tox c -e ${{ matrix.check }}
- name: Run check
run: tox -e ${{ matrix.check }}
2 changes: 1 addition & 1 deletion README.rst
@@ -74,7 +74,7 @@ and positron-emission tomography (PET) data.
the known directions and strengths of diffusion gradients*. J Magn Reson Imaging **24**:1188-1193.

.. [4] Andersson et al. (2012) *A comprehensive Gaussian Process framework for correcting distortions
and movements in difussion images*. In: 20th SMRT & 21st ISMRM, Melbourne, Australia.
and movements in diffusion images*. In: 20th SMRT & 21st ISMRM, Melbourne, Australia.

.. [5] Andersson & Sotiropoulos (2015) *Non-parametric representation and prediction of single- and
multi-shell diffusion-weighted MRI data using Gaussian processes*. NeuroImage **122**:166-176.
15 changes: 0 additions & 15 deletions docs/conf.py
@@ -154,21 +154,6 @@

# -- Options for LaTeX output ------------------------------------------------

latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#
# 'preamble': '',
# Latex figure (float) alignment
#
# 'figure_align': 'htbp',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
19 changes: 19 additions & 0 deletions pyproject.toml
@@ -73,6 +73,15 @@ test = [
"pytest-env",
"pytest-xdist >= 1.28"
]
types = [
"pandas-stubs",
"types-setuptools",
"scipy-stubs",
"types-PyYAML",
"types-tqdm",
"pytest",
"microsoft-python-type-stubs @ git+https://github.com/microsoft/python-type-stubs.git",
]

antsopt = [
"ConfigSpace",
@@ -122,6 +131,16 @@ version-file = "src/nifreeze/_version.py"
# Developer tool configurations
#

[[tool.mypy.overrides]]
module = [
"nipype.*",
"nilearn.*",
"nireports.*",
"nitransforms.*",
"seaborn",
]
ignore_missing_imports = true

[tool.ruff]
line-length = 99
target-version = "py310"
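Note: the `[[tool.mypy.overrides]]` table above silences missing-stub errors for whole packages (`nipype.*`, `nilearn.*`, etc.). A minimal sketch of the per-import alternative this avoids — the exact error code depends on the mypy version:

```python
# Per-import suppression instead of a config-wide override (sketch only).
# Without either, mypy reports something like:
#   error: Skipping analyzing "nipype": module is installed, but missing
#   library stubs or py.typed marker
from nipype.interfaces import fsl  # type: ignore[import-untyped]
```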
16 changes: 12 additions & 4 deletions scripts/dwi_gp_estimation_error_analysis_plot.py
@@ -89,10 +89,18 @@ def main() -> None:
df = pd.read_csv(args.error_data_fname, sep="\t", keep_default_na=False, na_values="n/a")

# Plot the prediction error
kfolds = sorted(np.unique(df["n_folds"].values))
snr = np.unique(df["snr"].values).item()
bval = np.unique(df["bval"].values).item()
rmse_data = [df.groupby("n_folds").get_group(k)["rmse"].values for k in kfolds]
kfolds = sorted(pd.unique(df["n_folds"]))
snr = pd.unique(df["snr"])
if len(snr) == 1:
snr = snr[0]
else:
raise ValueError(f"More than one unique SNR value: {snr}")
bval = pd.unique(df["bval"])
if len(bval) == 1:
bval = bval[0]
else:
raise ValueError(f"More than one unique bval value: {bval}")
rmse_data = np.asarray([df.groupby("n_folds").get_group(k)["rmse"].values for k in kfolds])
axis = 1
mean = np.mean(rmse_data, axis=axis)
std_dev = np.std(rmse_data, axis=axis)
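The two `pd.unique` blocks above follow the same check-single-value pattern; a hypothetical helper (not part of this PR) could factor it out while keeping the narrowed scalar explicit for the type checker:

```python
import pandas as pd


def single_unique(series: pd.Series, name: str) -> float:
    """Return the only unique value in ``series``; raise if there is more than one."""
    values = pd.unique(series)
    if len(values) != 1:
        raise ValueError(f"More than one unique {name} value: {values}")
    return values[0]


# Hypothetical usage mirroring the script above:
# snr = single_unique(df["snr"], "SNR")
# bval = single_unique(df["bval"], "bval")
```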
8 changes: 5 additions & 3 deletions scripts/dwi_gp_estimation_simulated_signal.py
@@ -132,11 +132,11 @@ def main() -> None:

# Fit the Gaussian Process regressor and predict on an arbitrary number of
# directions
a = 1.15
lambda_s = 120
beta_a = 1.15
beta_l = 120
alpha = 100
gpr = DiffusionGPR(
kernel=SphericalKriging(a=a, lambda_s=lambda_s),
kernel=SphericalKriging(beta_a=beta_a, beta_l=beta_l),
alpha=alpha,
optimizer=None,
)
@@ -154,6 +154,8 @@
X_test = np.vstack([gtab[~gtab.b0s_mask].bvecs, sph.vertices])

predictions = gpr_fit.predict(X_test)
if isinstance(predictions, tuple):
predictions = predictions[0]

# Save the predicted data
testsims.serialize_dwi(predictions.T, args.dwi_pred_data_fname)
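For context, scikit-learn's `GaussianProcessRegressor.predict` returns a bare array by default and a tuple when `return_std` or `return_cov` is requested, so the `isinstance` check above narrows the union for mypy. A minimal, generic sketch of the same narrowing pattern:

```python
import numpy as np


def narrow_prediction(pred: np.ndarray | tuple[np.ndarray, ...]) -> np.ndarray:
    """Keep only the mean prediction when a (mean, std) or (mean, cov) tuple is returned."""
    if isinstance(pred, tuple):
        return pred[0]
    return pred
```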
5 changes: 3 additions & 2 deletions scripts/optimize_registration.py
@@ -126,12 +126,13 @@ async def train_coro(
moving_path = tmp_folder / f"test-{index:04d}.nii.gz"
(~xfm).apply(refnii, reference=refnii).to_filename(moving_path)

_kwargs = {"output_transform_prefix": f"conversion-{index:04d}", **align_kwargs}

cmdline = erants.generate_command(
fixed_path,
moving_path,
fixedmask_path=brainmask_path,
output_transform_prefix=f"conversion-{index:04d}",
**align_kwargs,
**_kwargs,
)

tasks.append(
4 changes: 2 additions & 2 deletions src/nifreeze/cli/parser.py
@@ -29,13 +29,13 @@
import yaml


def _parse_yaml_config(file_path: Path) -> dict:
def _parse_yaml_config(file_path: str) -> dict:
"""
Parse YAML configuration file.

Parameters
----------
file_path : Path
file_path : str
Path to the YAML configuration file.

Returns
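The diff only shows the signature and docstring change; a minimal sketch of what a `str`-typed implementation could look like (the actual function body is not shown in this PR, and the name below is illustrative):

```python
from pathlib import Path

import yaml


def _parse_yaml_config_sketch(file_path: str) -> dict:
    """Illustrative only: accept a plain string path and load the YAML configuration."""
    with Path(file_path).open() as f:
        return yaml.safe_load(f)
```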
5 changes: 2 additions & 3 deletions src/nifreeze/data/dmri.py
@@ -89,9 +89,8 @@ def __len__(self):

def set_transform(self, index, affine, order=3):
"""Set an affine, and update data object and gradients."""
reference = namedtuple("ImageGrid", ("shape", "affine"))(
shape=self.dataobj.shape[:3], affine=self.affine
)
ImageGrid = namedtuple("ImageGrid", ("shape", "affine"))
reference = ImageGrid(shape=self.dataobj.shape[:3], affine=self.affine)

# create a nitransforms object
if self.fieldmap:
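Binding the `namedtuple` factory result to a capitalized name lets mypy see `ImageGrid` as a class before it is instantiated. A typed alternative, sketched here only as a possible follow-up (not part of this PR), would be `typing.NamedTuple`:

```python
from typing import NamedTuple

import numpy as np


class ImageGrid(NamedTuple):
    """Typed counterpart of the ad-hoc namedtuple used as a lightweight reference grid."""

    shape: tuple[int, int, int]
    affine: np.ndarray


# reference = ImageGrid(shape=self.dataobj.shape[:3], affine=self.affine)
```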
8 changes: 6 additions & 2 deletions src/nifreeze/data/filtering.py
@@ -77,8 +77,12 @@ def advanced_clip(
# Calculate stats on denoised version to avoid outlier bias
denoised = median_filter(data, footprint=ball(3))

a_min = np.percentile(denoised[denoised >= 0] if nonnegative else denoised, p_min)
a_max = np.percentile(denoised[denoised >= 0] if nonnegative else denoised, p_max)
a_min = np.percentile(
np.asarray([denoised[denoised >= 0] if nonnegative else denoised]), p_min
)
a_max = np.percentile(
np.asarray([denoised[denoised >= 0] if nonnegative else denoised]), p_max
)

# Clip and scale data
data = np.clip(data, a_min=a_min, a_max=a_max)
5 changes: 2 additions & 3 deletions src/nifreeze/data/pet.py
@@ -71,9 +71,8 @@

def set_transform(self, index, affine, order=3):
"""Set an affine, and update data object and gradients."""
reference = namedtuple("ImageGrid", ("shape", "affine"))(
shape=self.dataobj.shape[:3], affine=self.affine
)
ImageGrid = namedtuple("ImageGrid", ("shape", "affine"))
reference = ImageGrid(shape=self.dataobj.shape[:3], affine=self.affine)

[Codecov / codecov/patch warning: added lines src/nifreeze/data/pet.py#L74-L75 are not covered by tests]
xform = Affine(matrix=affine, reference=reference)

if not Path(self._filepath).exists():
4 changes: 3 additions & 1 deletion src/nifreeze/model/_dipy.py
@@ -25,6 +25,7 @@
from __future__ import annotations

import warnings
from typing import Any

import numpy as np
from dipy.core.gradients import GradientTable
@@ -87,6 +88,7 @@ class GaussianProcessModel(ReconstModel):
__slots__ = (
"kernel",
"_modelfit",
"sigma_sq",
)

def __init__(
@@ -137,7 +139,7 @@ def fit(
self,
data: np.ndarray,
gtab: GradientTable | np.ndarray,
mask: np.ndarray[bool] | None = None,
mask: np.ndarray[bool, Any] | None = None,
random_state: int = 0,
) -> GPFit:
"""Fit method of the DTI model class
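`np.ndarray` is generic over two parameters (shape and dtype), which is why the `mask` annotation gains a second argument. An alternative spelling via `numpy.typing.NDArray` — shown here only as a hedged sketch, not what this PR does — keeps the boolean dtype explicit:

```python
from __future__ import annotations

import numpy as np
from numpy.typing import NDArray


def fit_sketch(data: np.ndarray, mask: NDArray[np.bool_] | None = None) -> None:
    """Illustrative signature only: annotate a boolean mask with numpy.typing.NDArray."""
    ...
```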
14 changes: 8 additions & 6 deletions src/nifreeze/model/gpr.py
@@ -25,11 +25,11 @@
from __future__ import annotations

from numbers import Integral, Real
from typing import Callable, Mapping, Sequence
from typing import Callable, ClassVar, Mapping, Optional, Sequence, Union

import numpy as np
from scipy import optimize
from scipy.optimize._minimize import Bounds
from scipy.optimize import Bounds
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (
Hyperparameter,
@@ -153,7 +153,9 @@ class DiffusionGPR(GaussianProcessRegressor):

"""

_parameter_constraints: dict = {
optimizer: Optional[Union[StrOptions, Callable, None]] = None

_parameter_constraints: ClassVar[dict] = {
"kernel": [None, Kernel],
"alpha": [Interval(Real, 0, None, closed="left"), np.ndarray],
"optimizer": [StrOptions(SUPPORTED_OPTIMIZERS), callable, None],
@@ -212,7 +214,7 @@ def _constrained_optimization(
) -> tuple[float, float]:
options = {}
if self.optimizer == "fmin_l_bfgs_b":
from sklearn.utils.optimize import _check_optimize_result
from sklearn.utils.optimize import _check_optimize_result # type: ignore

for name in LBFGS_CONFIGURABLE_OPTIONS:
if (value := getattr(self, name, None)) is not None:
@@ -332,7 +334,7 @@ def __call__(

return self.beta_l * C_theta, K_gradient

def diag(self, X: np.ndarray) -> np.ndarray:
def diag(self, X) -> np.ndarray:
"""Returns the diagonal of the kernel k(X, X).

The result of this method is identical to np.diag(self(X)); however,
@@ -442,7 +444,7 @@ def __call__(

return self.beta_l * C_theta, K_gradient

def diag(self, X: np.ndarray) -> np.ndarray:
def diag(self, X) -> np.ndarray:
"""Returns the diagonal of the kernel k(X, X).

The result of this method is identical to np.diag(self(X)); however,
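`ClassVar` tells mypy that `_parameter_constraints` is a class-level table shared by all instances rather than an attribute expected on each instance. A self-contained sketch of the pattern (names below are illustrative, not from the library):

```python
from typing import Callable, ClassVar


class EstimatorSketch:
    """Illustrates ClassVar: a constraint table shared by the class, plus instance state."""

    # Shared by every instance; mypy rejects assigning to it through `self`.
    _parameter_constraints: ClassVar[dict] = {"optimizer": [str, Callable, None]}

    def __init__(self, optimizer: str | Callable | None = None) -> None:
        self.optimizer = optimizer  # ordinary per-instance attribute
```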
9 changes: 5 additions & 4 deletions src/nifreeze/registration/ants.py
@@ -389,7 +389,7 @@
str(p) for p in _massage_mask_path(movingmask_path, nlevels)
]

# Set initalizing affine if provided
# Set initializing affine if provided
if init_affine is not None:
settings["initial_moving_transform"] = str(init_affine)

@@ -413,7 +413,7 @@
i_iter: int,
vol_idx: int,
dirname: Path,
reg_target_type: str,
reg_target_type: str | tuple[str, str],
align_kwargs: dict,
) -> nt.base.BaseTransform:
"""
Expand Down Expand Up @@ -443,7 +443,7 @@
DWI frame index.
dirname : :obj:`Path`
Directory name where the transformation is saved.
reg_target_type : :obj:`str`
reg_target_type : :obj:`str` or tuple of :obj:`str`
Target registration type.
align_kwargs : :obj:`dict`
Parameters to configure the image registration process.
@@ -472,7 +472,8 @@
registration.inputs.fixed_image_masks = ["NULL", bmask_img]

if em_affines is not None and np.any(em_affines[vol_idx, ...]):
reference = namedtuple("ImageGrid", ("shape", "affine"))(shape=shape, affine=affine)
ImageGrid = namedtuple("ImageGrid", ("shape", "affine"))
reference = ImageGrid(shape=shape, affine=affine)

[Codecov / codecov/patch warning: added lines src/nifreeze/registration/ants.py#L475-L476 are not covered by tests]

# create a nitransforms object
if fieldmap:
2 changes: 1 addition & 1 deletion src/nifreeze/testing/simulations.py
@@ -67,7 +67,7 @@ def add_b0(bvals: np.ndarray, bvecs: np.ndarray) -> tuple[np.ndarray, np.ndarray]

def create_single_fiber_evecs(theta: float = 0, phi: float = 0) -> np.ndarray:
"""
Create eigenvectors for a simulated fiber given the polar coordinates of its pricipal axis.
Create eigenvectors for a simulated fiber given the polar coordinates of its principal axis.

Parameters
----------
2 changes: 1 addition & 1 deletion src/nifreeze/viz/signals.py
@@ -37,7 +37,7 @@ def plot_error(
ylabel: str,
title: str,
color: str = "orange",
figsize: tuple[int, int] = (19.2, 10.8),
figsize: tuple[float, float] = (19.2, 10.8),
) -> plt.Figure:
"""
Plot the error and standard deviation.
4 changes: 0 additions & 4 deletions test/test_gpr.py
@@ -20,17 +20,13 @@
#
# https://www.nipreps.org/community/licensing/
#
from collections import namedtuple

import numpy as np
import pytest
from dipy.io import read_bvals_bvecs

from nifreeze.model import gpr

GradientTablePatch = namedtuple("gtab", ["bvals", "bvecs"])


THETAS = np.linspace(0, np.pi / 2, num=50)
EXPECTED_EXPONENTIAL = [
1.0,
9 changes: 9 additions & 0 deletions tox.ini
@@ -46,6 +46,15 @@ extras = doc
commands =
make -C docs/ SPHINXOPTS="-W -v" BUILDDIR="$HOME/docs" OUTDIR="${CURBRANCH:-html}" html

[testenv:typecheck]
description = Run mypy type checking
labels = check
deps =
mypy
extras = types
commands =
mypy .

[testenv:spellcheck]
description = Check spelling
labels = check