
Fftshift for FourierOp #143

Merged · 5 commits merged into main · Jan 26, 2024
Conversation

ckolbPTB (Collaborator)

No description provided.

ckolbPTB requested a review from koflera on January 17, 2024 at 16:58
@ckolbPTB (Collaborator, Author)

@JoHa0811 and @schuenke: it would be great if you could test this with your data to make sure everything is now nicely centered.
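
For context, "nicely centered" refers to the usual fftshift/ifftshift bookkeeping around the FFT. A minimal sketch of that convention, for illustration only (this is not mrpro's actual FourierOp code, and the helper names are made up):

```python
import torch


def centered_fft(image: torch.Tensor, dims: tuple[int, ...] = (-2, -1)) -> torch.Tensor:
    # ifftshift moves the image center to index 0 before the FFT,
    # fftshift moves DC back to the center of k-space afterwards,
    # so an object centered in image space stays centered in k-space.
    return torch.fft.fftshift(torch.fft.fftn(torch.fft.ifftshift(image, dim=dims), dim=dims, norm='ortho'), dim=dims)


def centered_ifft(kspace: torch.Tensor, dims: tuple[int, ...] = (-2, -1)) -> torch.Tensor:
    return torch.fft.fftshift(torch.fft.ifftn(torch.fft.ifftshift(kspace, dim=dims), dim=dims, norm='ortho'), dim=dims)
```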

github-actions bot (Contributor) commented Jan 17, 2024

Coverage Report
| File | Stmts | Miss | Cover | Missing |
| --- | --- | --- | --- | --- |
| **src/mrpro/algorithms** | | | | |
| _remove_readout_os.py | 30 | 2 | 93% | 46, 48 |
| **src/mrpro/data** | | | | |
| _AccelerationFactor.py | 13 | 13 | 0% | 15–36 |
| _AcqInfo.py | 64 | 1 | 98% | 93 |
| _DcfData.py | 78 | 3 | 96% | 71, 97, 178 |
| _IData.py | 57 | 7 | 88% | 43, 48, 109, 114, 122, 128, 132 |
| _IHeader.py | 58 | 2 | 97% | 86, 113 |
| _KData.py | 86 | 6 | 93% | 83–84, 126, 153, 158–159 |
| _KHeader.py | 125 | 9 | 93% | 34, 81, 119, 170, 181, 188–189, 192, 199 |
| _KNoise.py | 21 | 9 | 57% | 48–61 |
| _SpatialDimension.py | 39 | 5 | 87% | 74, 77, 84–86 |
| _TrajectoryDescription.py | 14 | 1 | 93% | 35 |
| **src/mrpro/data/traj_calculators** | | | | |
| _KTrajectoryCalculator.py | 26 | 3 | 88% | 37, 63, 65 |
| _KTrajectoryIsmrmrd.py | 15 | 2 | 87% | 55, 62 |
| _KTrajectoryPulseq.py | 33 | 1 | 97% | 72 |
| **src/mrpro/operators** | | | | |
| _FourierOp.py | 112 | 6 | 95% | 43–50, 205, 255 |
| _LinearOperator.py | 21 | 3 | 86% | 28, 47, 52 |
| _ZeroPadOp.py | 15 | 1 | 93% | 45 |
| **src/mrpro/utils** | | | | |
| _rgetattr.py | 3 | 1 | 67% | 20 |
| _zero_pad_or_crop.py | 32 | 6 | 81% | 37, 41, 64, 67, 70, 73 |
| **TOTAL** | 1451 | 81 | 94% | |

| Tests | Skipped | Failures | Errors | Time |
| --- | --- | --- | --- | --- |
| 104 | 0 💤 | 0 ❌ | 0 🔥 | 32.429s ⏱️ |

@schuenke (Collaborator)

> @JoHa0811 and @schuenke: it would be great if you could test this with your data to make sure everything is now nicely centered.

I tested the following cases, which worked fine:

  1. 2D dynamic Cartesian pulseq data using the KTrajectoryCartesian calculator
  2. 2D dynamic radial pulseq data using the KTrajectoryRadial2D calculator

I tried to test KTrajectoryPulseq as well, but it is currently not working at all, even after fixing #141 and #144.

Resolved review threads:
- tests/utils/test_fft.py (outdated)
- tests/utils/test_fft.py (outdated)
- src/mrpro/operators/_FourierOp.py
- src/mrpro/utils/fft.py (outdated)
ckolbPTB mentioned this pull request Jan 21, 2024
ckolbPTB requested a review from koflera on January 25, 2024 at 10:01
Resolved review threads (all outdated):
- src/mrpro/utils/_change_data_shape.py
- src/mrpro/operators/_PadOp.py
- src/mrpro/operators/_PadOp.py
- tests/utils/test_change_data_shape.py
- tests/utils/test_change_data_shape.py
- tests/utils/test_change_data_shape.py
```python
import torch.nn.functional as F


def change_data_shape(dat: torch.Tensor, dat_shape_new: tuple[int, ...]) -> torch.Tensor:
```
@fzimmermann89 (Member) commented Jan 25, 2024
A couple of possible issues:

  1. the name sounds like a reshape operation.
  2. consider data instead of dat
  3. consider dat_shape_new -> new_shape (or I might call you yoda)
  4. what if data has a different number of dimensions than the new shape?
Suggested change (replacing `def change_data_shape(dat: torch.Tensor, dat_shape_new: tuple[int, ...]) -> torch.Tensor:`):

```python
def pad_or_crop(data: torch.Tensor, new_shape: tuple[int, ...] | torch.Size) -> torch.Tensor:
    """Change shape of data by cropping or zero-padding.

    Parameters
    ----------
    data
        data
    new_shape
        target shape of data

    Returns
    -------
        data padded or cropped to shape
    """
    if len(new_shape) != data.ndim:
        raise ValueError("length of shape should match dimensions of data")
    npad = []
    for old, new in zip(data.shape, new_shape):
        diff = new - old
        # ensures padding and cropping lead to the matching asymmetry for odd shape differences
        before = math.trunc(diff / 2)
        after = diff - before
        npad.append(before)
        npad.append(after)
    # npad has to be reversed because pad expects it in reversed order
    if any(npad):
        data = F.pad(data, npad[::-1])
    return data
```
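
A quick usage sketch of the suggested helper (assuming the `pad_or_crop` above, plus `import math`, `import torch` and `import torch.nn.functional as F` in the same file):

```python
import torch

x = torch.arange(16.0).reshape(4, 4)
padded = pad_or_crop(x, (6, 6))      # zero-pads symmetrically to 6x6
cropped = pad_or_crop(x, (4, 2))     # negative padding in F.pad crops the last axis to width 2
print(padded.shape, cropped.shape)   # torch.Size([6, 6]) torch.Size([4, 2])
```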

Or, including the along_dimension functionality:

Suggested change (replacing `def change_data_shape(dat: torch.Tensor, dat_shape_new: tuple[int, ...]) -> torch.Tensor:`):

```python
import math

import torch
import torch.nn.functional as F


def normalize_index(ndim: int, index: int) -> int:
    """Normalize possibly negative indices.

    Parameters
    ----------
    ndim
        number of dimensions
    index
        index to normalize. negative indices count from the end.

    Raises
    ------
    IndexError
        if index is outside [-ndim, ndim)
    """
    if 0 <= index < ndim:
        return index
    elif -ndim <= index < 0:
        return ndim + index
    else:
        raise IndexError(f"Invalid index {index} for {ndim} data dimensions")


def pad_or_crop(
    data: torch.Tensor,
    new_shape: tuple[int, ...] | torch.Size,
    dim: None | tuple[int, ...] = None,
) -> torch.Tensor:
    """Change shape of data by cropping or zero-padding.

    Parameters
    ----------
    data
        data
    new_shape
        desired shape of data
    dim
        dimensions new_shape corresponds to. None (default) is interpreted as the last len(new_shape) dimensions.

    Returns
    -------
        data padded or cropped to shape
    """
    if len(new_shape) > data.ndim:
        raise ValueError("length of new shape should not exceed dimensions of data")

    if dim is None:  # use the last len(new_shape) dimensions
        new_shape = data.shape[: -len(new_shape)] + tuple(new_shape)
    else:
        if len(new_shape) != len(dim):
            raise ValueError("length of shape should match length of dim")
        dim = tuple(normalize_index(data.ndim, idx) for idx in dim)  # raises if any not in [-data.ndim, data.ndim)
        if len(dim) != len(set(dim)):  # this is why we normalize
            raise ValueError("repeated values are not allowed in dims")
        new_shape_full = list(data.shape)
        for i, d in enumerate(dim):
            new_shape_full[d] = new_shape[i]
        new_shape = tuple(new_shape_full)

    npad = []
    for old, new in zip(data.shape, new_shape):
        diff = new - old
        after = math.trunc(diff / 2)
        before = diff - after
        npad.append(before)
        npad.append(after)

    if any(npad):
        # F.pad expects paddings in reversed order
        data = F.pad(data, npad[::-1])
    return data
```
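
And a usage sketch for the along-dimension variant (again assuming the suggested `pad_or_crop` above), resizing only selected axes of a batched tensor:

```python
import torch

batch = torch.ones(2, 8, 64, 64)                  # e.g. (other, coils, y, x)
y_only = pad_or_crop(batch, (96,), dim=(-2,))     # zero-pad only the y axis
yx = pad_or_crop(batch, (96, 32), dim=(-2, -1))   # pad y, crop x
print(y_only.shape, yx.shape)                     # (2, 8, 96, 64) and (2, 8, 96, 32)
```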

ckolbPTB (Collaborator, Author)
I incorporated your second suggestion with the dimension option, so that it can be used directly in the PadOp.

Resolved review threads (all outdated):
- tests/operators/test_zero_pad_op.py
- tests/operators/test_zero_pad_op.py
- tests/operators/test_fast_fourier_op.py
ckolbPTB merged commit 75b144b into main on Jan 26, 2024
4 checks passed
ckolbPTB deleted the fft_shift branch on January 26, 2024 at 16:32
```python
dim = tuple(normalize_index(data.ndim, idx) for idx in dim)  # raises if any not in [-data.ndim,data.ndim)
if len(dim) != len(set(dim)):  # this is why we normalize
    raise ValueError('repeated values are not allowed in dims')
new_shape_full = torch.tensor(data.shape)
```

Member:

why is this a tensor if you don't use tuple indexing in the next line?

```python
if len(dim) != len(set(dim)):  # this is why we normalize
    raise ValueError('repeated values are not allowed in dims')
new_shape_full = torch.tensor(data.shape)
for i, d in enumerate(dim):
```

Member:

is there a reason for this loop instead of just indexing with dims?

```python
    new_shape_full[d] = new_shape[i]

npad = []
for old, new in zip(torch.tensor(data.shape), new_shape_full):
```

Member:

why are the shapes tensors anyway?

ckolbPTB (Collaborator, Author):

otherwise the difference in the line below does not work
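
For the indexing question above, a self-contained illustration (not the merged code) of how the per-dimension loop could be replaced by advanced indexing on a tensor shape:

```python
import torch

data = torch.ones(2, 8, 64, 64)   # stand-in input
dim = (2, 3)                      # already normalized, non-negative indices
new_shape = (96, 32)

# assign all requested dimensions at once instead of looping
new_shape_full = torch.tensor(data.shape)
new_shape_full[list(dim)] = torch.tensor(new_shape)
print(new_shape_full.tolist())    # [2, 8, 96, 32]
```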

@fzimmermann89 (Member)

The padding is broken for dims=None.

fzimmermann89 pushed a commit that referenced this pull request Nov 10, 2024
* FastFourierOp and PadOp added