Port SFM model back from the legacy branch #4

Open
oesteban opened this issue Apr 14, 2021 · 11 comments

@oesteban (Member) commented Apr 14, 2021

cf. nipreps/eddymotion#15.

@josephmje

The isotropic fit can be used for single-shell data, but multi-shell data will need the exponential isotropic fit. DIPY should have some code for detecting whether data are single- or multi-shell; @arokem mentioned it's used in the DKI model to test whether the data are eligible.

@arokem

arokem commented Dec 7, 2021

This is the function that checks if data are multi-b: https://github.com/dipy/dipy/blob/be956a529465b28085f8fc435a756947ddee1c89/dipy/core/gradients.py#L825
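For reference, the gist of that check can be sketched in plain NumPy (the rounding scheme and `b0_threshold` here are assumptions for illustration; the linked DIPY function is authoritative):

```python
import numpy as np

def is_multi_shell(bvals, b0_threshold=50):
    """Sketch of a multi-b check: round non-zero b-values to a common
    magnitude and count how many distinct shells remain."""
    bvals = np.asarray(bvals, dtype=float)
    nonzero = bvals[bvals > b0_threshold]
    # Round to one order of magnitude below the largest b-value,
    # so that e.g. 995 and 1005 collapse onto the b=1000 shell.
    bmag = int(np.log10(nonzero.max())) - 1
    shells = np.unique(np.round(nonzero / 10**bmag) * 10**bmag)
    return bool(shells.size > 1)

print(is_multi_shell([0, 995, 1000, 1005]))   # False: one shell
print(is_multi_shell([0, 1000, 2000, 3000]))  # True: three shells
```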

@josephmje

Reporting some weirdness with the SFM model currently:

If a mask is included, we run into memory issues, even on the small IXI dataset:

MemoryError: Unable to allocate 735. GiB for an array with shape (128, 128, 56, 128, 56, 15) and data type float64

If no mask is given, we run into a looping issue:

AttributeError                            Traceback (most recent call last)
<ipython-input-15-c2464056eea0> in <module>
----> 1 em_affines = EddyMotionEstimator.fit(dmri_dataset, model="SFM", omp_nthreads=16)

/mnt/tigrlab/scratch/mjoseph/eddymotion/eddymotion/estimator.py in fit(dwdata, n_iter, align_kwargs, model, omp_nthreads, seed, **kwargs)
     97 
     98                         # fit the model
---> 99                         dwmodel.fit(data_train[0])
    100 
    101                         # generate a synthetic dw volume for the test gradient

/mnt/tigrlab/scratch/mjoseph/eddymotion/eddymotion/model.py in fit(self, data, **kwargs)
     90 
     91             try:
---> 92                 self._model = loop.run_until_complete(
     93                     asyncio.gather(*fit_tasks))
     94             finally:

/scratch/mjoseph/.pyenv/versions/3.8.1/envs/ohbm_venv/lib/python3.8/site-packages/nest_asyncio.py in run_until_complete(self, future)
     79                 raise RuntimeError(
     80                     'Event loop stopped before Future completed.')
---> 81             return f.result()
     82 
     83     def _run_once(self):

/scratch/mjoseph/.pyenv/versions/3.8.1/lib/python3.8/concurrent/futures/thread.py in run(self)
     55 
     56         try:
---> 57             result = self.fn(*self.args, **self.kwargs)
     58         except BaseException as exc:
     59             self.future.set_exception(exc)

/mnt/tigrlab/scratch/mjoseph/eddymotion/eddymotion/model.py in _model_fit(model, data)
    430 
    431 def _model_fit(model, data):
--> 432     return model.fit(data)

/scratch/mjoseph/.pyenv/versions/3.8.1/envs/ohbm_venv/lib/python3.8/site-packages/dipy/reconst/sfm.py in fit(self, data, mask)
    472                 with warnings.catch_warnings():
    473                     warnings.simplefilter("ignore")
--> 474                     flat_params[vox] = self.solver.fit(self.design_matrix,
    475                                                        fit_it).coef_
    476 

/scratch/mjoseph/.pyenv/versions/3.8.1/envs/ohbm_venv/lib/python3.8/site-packages/sklearn/linear_model/_coordinate_descent.py in fit(self, X, y, sample_weight, check_input)
   1061             coef_[k] = this_coef[:, 0]
   1062             dual_gaps_[k] = this_dual_gap[0]
-> 1063             self.n_iter_.append(this_iter[0])
   1064 
   1065         if n_targets == 1:

AttributeError: 'int' object has no attribute 'append'
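For context (my reading, not a confirmed diagnosis): in scikit-learn's coordinate-descent models, `n_iter_` is a list while `fit` loops over targets and is collapsed to a plain `int` when there is a single target, so the `append` in this traceback can only fail if another fit on the same solver instance completed in between, which would point at the asynchronous tasks sharing one solver object. The type behavior itself is easy to see:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))

enet = ElasticNet(alpha=0.1)
enet.fit(X, rng.standard_normal(20))        # single target
print(type(enet.n_iter_))                   # a plain int for one target

enet.fit(X, rng.standard_normal((20, 2)))   # two targets
print(type(enet.n_iter_))                   # a list for multiple targets
```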

@arokem

arokem commented Dec 15, 2021

What's the shape of the mask? Shape of the data?

@josephmje

Mask is (128, 128, 56) and data is (128, 128, 56, 15)

@arokem

arokem commented Dec 15, 2021

That makes sense. Do you happen to have some information on which line raises that memory error? Is it in this codebase, or coming from deeper in DIPY?

@arokem

arokem commented Dec 15, 2021

For the looping issue, could you please put in a breakpoint before this line?

99                         dwmodel.fit(data_train[0])

What is data_train[0] at this point? An array, I assume? What shape does it have?

@josephmje

josephmje commented Dec 15, 2021

> That makes sense. Do you happen to have some information on which line raises that memory error? Is it in this codebase, or coming from deeper in DIPY?

        # Apply mask (ensures data is now 2D)
        data = data[self._mask, ...]

It's within this codebase. I'm double-checking whether I defined the mask properly in SparseFascicleModel.
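Worth noting: the allocation shape in the MemoryError, (128, 128, 56, 128, 56, 15), is exactly what NumPy produces at that line if the mask is an *integer* array (e.g. a NIfTI mask loaded as uint8) rather than boolean, because integer fancy indexing broadcasts instead of selecting voxels. A small sketch of the difference, with shapes scaled down (this is a hypothesis about the failure, not confirmed):

```python
import numpy as np

data = np.zeros((4, 4, 3, 15))          # stand-in for (128, 128, 56, 15)
bool_mask = np.ones((4, 4, 3), dtype=bool)
int_mask = bool_mask.astype(np.uint8)   # same values, wrong dtype

# Boolean indexing selects voxels and flattens the spatial axes:
print(data[bool_mask, ...].shape)       # (48, 15)

# Integer indexing treats the mask as indices into axis 0 and broadcasts:
print(data[int_mask, ...].shape)        # (4, 4, 3, 4, 3, 15)
```

With the real shapes, the integer-mask path yields (128, 128, 56, 128, 56, 15), matching the reported 735 GiB allocation.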

@josephmje
Copy link

josephmje commented Dec 15, 2021

> What is data_train[0] at this point? An array, I assume? What shape does it have?

data_train[0] is an array. I think it will always be the original dataset that gets updated with each iteration.

@arokem

arokem commented Dec 15, 2021

I worry that it somehow becomes a 1-dimensional array (i.e., data from a single voxel), which often needs special handling because it no longer has the spatial dimensions.

@arokem

arokem commented Dec 15, 2021

I would put in a breakpoint and do a sanity check on the shape and dtype of the data and mask variables right before that memory error. Assuming the mask and data are what you put in there, this code should be fine. But I wonder whether something flattens the data or the mask into a 2D array, before you get to that line, or something like that.
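A minimal version of that sanity check (the function name and messages here are illustrative, not from the codebase) might look like:

```python
import numpy as np

def check_mask(data, mask):
    """Illustrative pre-fit check: fail loudly on dtype/shape problems
    instead of letting fancy indexing broadcast a huge array."""
    data, mask = np.asarray(data), np.asarray(mask)
    if mask.dtype != bool:
        raise TypeError(f"mask must be boolean, got {mask.dtype}")
    if mask.shape != data.shape[:-1]:
        raise ValueError(f"mask {mask.shape} != spatial dims {data.shape[:-1]}")
    flat = data[mask]                # -> (n_voxels_in_mask, n_volumes)
    assert flat.ndim == 2
    return flat

data = np.zeros((128, 128, 56, 15))
mask = np.ones((128, 128, 56), dtype=bool)
print(check_mask(data, mask).shape)  # (917504, 15)
```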

@oesteban oesteban transferred this issue from nipreps/eddymotion Dec 19, 2024