
[ENH] Vector-checking utilities #26

Merged: 13 commits from enh/vectors into nipreps:master, Nov 5, 2019
Conversation

@dPys (Collaborator) commented Oct 16, 2019

Per Issue #24
FR: Conformity check of bvecs and bvals (pure-python implementation)

Todo:

  1. Reportlet visualization and description of the sampling scheme.
  2. Parameterized smoke tests for each type of corrupted or incompatible bvec/bval file.

Comments/Suggestions welcome.
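The parameterized smoke tests mentioned in the to-do might look like the sketch below. `check_vecs` is a hypothetical stand-in for the PR's actual checker, and the corruption cases and tolerances are illustrative only:

```python
import numpy as np
import pytest


def check_vecs(bvecs, bvals):
    """Hypothetical checker standing in for the PR's implementation."""
    bvecs = np.asarray(bvecs, dtype=float)
    bvals = np.asarray(bvals, dtype=float)
    if bvecs.shape[0] != 3 or bvecs.shape[1] != bvals.size:
        raise ValueError("bvec/bval shape mismatch")
    norms = np.linalg.norm(bvecs, axis=0)
    # Each column must be (near-)zero (a b=0) or (near-)unit-norm.
    if not np.all((norms < 1e-4) | (np.abs(norms - 1.0) < 1e-2)):
        raise ValueError("bvecs are neither zero nor unit-norm")


@pytest.mark.parametrize("bvecs, bvals", [
    (np.zeros((3, 4)), np.ones(5)),                               # length mismatch
    (np.full((3, 4), 2.0), np.array([0., 1000., 1000., 1000.])),  # non-unit bvecs
])
def test_corrupted_inputs_raise(bvecs, bvals):
    with pytest.raises(ValueError):
        check_vecs(bvecs, bvals)
```

Each corrupted scheme becomes one parametrized case, so adding a new failure mode is a one-line change.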
@arokem (Collaborator) left a review comment

Would be great to have tests here!

7 review threads on dmriprep/utils/vectors.py (outdated; resolved)
@dPys (Collaborator, Author) commented Oct 16, 2019

Thanks for the quick review, @arokem. Will make all requested changes and add tests by early next week!

Cheers,
@dPys

@dPys (Collaborator, Author) commented Oct 21, 2019

@arokem -- basic tests are now included in dmriprep/utils/tests

@dPys dPys added the enhancement New feature or request label Oct 21, 2019
@oesteban (Member) left a review comment

Thanks for this contribution. I'm missing a few things:

  • A Nipype interface that can be embedded in the workflow.
  • More test cases of bvec/bval examples, including intentionally bad cases and edge cases (e.g., bvecs with norms very close to 1 or very close to 0, but not exactly either).
  • Clearer delineation of responsibilities: more functions with atomic functionality. As an exception, I believe it could be much more readable and robust to do all the rescalings (bvec and bval) in just one function.
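The norm edge cases in the second bullet can be exercised with explicit tolerances. The thresholds below are illustrative assumptions, not the values used in the PR:

```python
import numpy as np

# Illustrative tolerances (assumptions, not the PR's values).
B0_NORM_MAX = 1e-4     # below this, treat the column as a b=0 vector
UNIT_NORM_TOL = 1e-2   # |norm - 1| within this counts as unit-norm


def classify_bvec_norms(bvecs):
    """Label each column of a 3xN bvec array as 'b0', 'unit', or 'bad'."""
    norms = np.linalg.norm(np.asarray(bvecs, dtype=float), axis=0)
    labels = np.where(norms < B0_NORM_MAX, "b0",
                      np.where(np.abs(norms - 1.0) < UNIT_NORM_TOL, "unit", "bad"))
    return labels.tolist()


# Edge cases: exactly zero, nearly (but not exactly) zero,
# nearly (but not exactly) one, and clearly wrong.
bvecs = np.array([[0.0, 1e-3, 0.9999, 0.5],
                  [0.0, 0.0,  0.0,    0.0],
                  [0.0, 0.0,  0.0,    0.0]])
print(classify_bvec_norms(bvecs))  # ['b0', 'bad', 'unit', 'bad']
```

The near-zero column (norm 1e-3) is deliberately flagged as bad: it is neither a clean b=0 nor a unit gradient.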

Review threads (outdated; resolved) on: dmriprep/.idea/workspace.xml, dmriprep/utils/tests/test_vectors.py (x4), dmriprep/utils/vectors.py (x5)
@oesteban (Member) commented:

BTW, happy to help as much as I can by sending PRs to your branch!

@dPys (Collaborator, Author) commented Oct 22, 2019

Thanks so much @oesteban for going through this PR so carefully! Will make the requested changes and get back to you in a day or so.

@dPys

@dPys (Collaborator, Author) commented Oct 23, 2019

@oesteban and @arokem --
The PR is now updated with most of the requested changes, and then some.

Additional changes:

  • Function for writing the gtab out to .tsv.
  • Tests now use sherbrooke_3shell with dipy's fetcher, so that we don't have to include raw data in the package.
  • CheckVecs is now a Nipype interface that lives in interfaces/vectors.py.
  • Significantly more modularity: separate routines now live in separate functions.

Remaining to-dos:

  • Create a series of synthetic corrupted bval/bvec files to implement parametric pytesting. Should these files be stored in the repo, or should I write functions to create them on the fly?
  • I'd still like to expand image_gradient_consistency_check to iterate across volumes of the dwi, and perhaps check that volume indices with higher signal contrast correspond to B0 indices.
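On the storage question, one option is generating the corrupted files on the fly into a temporary directory. A sketch of such a helper (the function name and corruption kinds are made up for illustration):

```python
import tempfile
from pathlib import Path

import numpy as np


def make_corrupted_scheme(tmpdir, kind):
    """Write a deliberately broken bvec/bval pair and return their paths.

    The corruption kinds here are illustrative, not the PR's test cases.
    """
    rng = np.random.default_rng(0)
    bvecs = rng.normal(size=(3, 10))
    bvecs /= np.linalg.norm(bvecs, axis=0)   # start from a valid unit-norm set
    bvals = np.array([0.0] + [1000.0] * 9)

    if kind == "length_mismatch":
        bvals = bvals[:-2]          # fewer bvals than bvec columns
    elif kind == "non_unit":
        bvecs = bvecs * 3.0         # norms far from 1
    elif kind == "negative_bval":
        bvals[3] = -1000.0          # physically impossible b-value

    fbvec = Path(tmpdir) / f"{kind}.bvec"
    fbval = Path(tmpdir) / f"{kind}.bval"
    np.savetxt(fbvec, bvecs, fmt="%.6f")
    np.savetxt(fbval, bvals[np.newaxis, :], fmt="%g")
    return fbvec, fbval


with tempfile.TemporaryDirectory() as tmpdir:
    fbvec, fbval = make_corrupted_scheme(tmpdir, "length_mismatch")
    print(np.loadtxt(fbval).size)  # 8 bvals vs 10 bvec columns
```

Generating on the fly keeps binary fixtures out of the repository and pairs naturally with parametrized tests over the `kind` values.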

@dPys dPys force-pushed the enh/vectors branch 2 times, most recently from 976a317 to 4e0233f Compare October 23, 2019 08:25
@oesteban (Member) left a review comment

Looking much much better, please keep up with the good work!

Review threads (outdated; resolved) on: dmriprep/interfaces/vectors.py (x5), dmriprep/utils/vectors.py (x3)
@dPys (Collaborator, Author) commented Oct 26, 2019

> Looking much much better, please keep up with the good work!

Glad to hear it.

@oesteban (Member) commented:

one question to @arokem, @garikoitz, et al.:

I guess here we are also missing the conversion between image-index coordinates and RAS coordinates for the bvecs, is that correct?

I believe FSL and MRTrix use image coordinates - what is the case for dipy?

@garikoitz (Collaborator) commented Oct 28, 2019 via email

I think FSL is image coordinates but mrtrix is scanner (real) coordinates. Wasn't this part of Derek's PR?

@dPys (Collaborator, Author) commented Oct 28, 2019

Well, I mean in my original PR, I had included the following:

def reorient_image_and_vecs(dwi_file, fbvec, out_dir, out_orient='RAS'):
    """
    A function to reorient any dwi image and associated bvecs to a given orientation.

    Parameters
    ----------
    dwi_file : str
        File path to a dwi Nifti1Image.
    fbvec : str
        File path to corresponding bvecs file.
    out_dir : str
        Path to output directory.
    out_orient : str
        An orientation towards which to reorient both the input dwi image and
        its accompanying bvecs. RAS, LAS, and LPS are supported. Default is RAS+.

    Returns
    -------
    out_fname : str
        File path to the reoriented dwi Nifti1Image.
    out_bvec_fname : str
        File path to corresponding reoriented bvecs file.
    """
    import nibabel as nib
    import numpy as np

    from dmriprep.interfaces.images import normalize_xform
    from nipype.utils.filemanip import fname_presuffix

    out_fname = fname_presuffix(dwi_file, newpath=out_dir, suffix="%s%s" % ('_', out_orient), use_ext=True)
    out_bvec_fname = fname_presuffix(fbvec, newpath=out_dir, suffix="%s%s" % ('_', out_orient), use_ext=True)

    input_img = nib.load(dwi_file)
    input_axcodes = nib.aff2axcodes(input_img.affine)
    reoriented = nib.as_closest_canonical(input_img)  # NB: always returns RAS+, even when out_orient is LAS/LPS
    normalized = normalize_xform(reoriented)

    if out_orient == 'RAS':
        new_axcodes = ('R', 'A', 'S')
    elif out_orient == 'LAS':
        new_axcodes = ('L', 'A', 'S')
    elif out_orient == 'LPS':
        new_axcodes = ('L', 'P', 'S')
    else:
        raise ValueError('Orientation not available.')

    if normalized is not input_img:
        print(f"Reorienting {dwi_file} and {fbvec} to {out_orient}...")

        # Flip the bvecs
        input_orientation = nib.orientations.axcodes2ornt(input_axcodes)
        desired_orientation = nib.orientations.axcodes2ornt(new_axcodes)
        transform_orientation = nib.orientations.ornt_transform(input_orientation, desired_orientation)
        bvec_array = np.loadtxt(fbvec)
        if bvec_array.shape[0] != 3:
            bvec_array = bvec_array.T
        if not bvec_array.shape[0] == transform_orientation.shape[0]:
            raise ValueError("Unrecognized bvec format")
        output_array = np.zeros_like(bvec_array)
        for this_axnum, (axnum, flip) in enumerate(transform_orientation):
            output_array[this_axnum] = bvec_array[int(axnum)] * float(flip)
        np.savetxt(out_bvec_fname, output_array, fmt="%.8f ")

        normalized.to_filename(out_fname)
    else:
        import shutil
        shutil.copy2(dwi_file, out_fname)
        shutil.copy2(fbvec, out_bvec_fname)

    return out_fname, out_bvec_fname

Basically, if the dwi image is in RAS, then we leave the bvec alone. If the image is in any other orientation, we reorient it to RAS and invert the corresponding axes of the bvecs that require flipping to get there. I was under the impression that pretty much all tools use image index coordinates as the convention? Is that not the case?

If so, and if we do plan to impose RAS+ orientation in dmriprep in general, it should definitely happen near the beginning of the workflow, and the bvec should be corrected accordingly...
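The axis-flipping described above can be illustrated without nibabel by applying an (axis, flip) transform table of the kind `nib.orientations.ornt_transform` returns; the LAS-to-RAS table below is hand-written for illustration, mirroring the loop in the snippet above:

```python
import numpy as np

# An orientation transform as nibabel's ornt_transform would return it:
# row i = (source axis that maps to output axis i, flip sign).
# Example: LAS -> RAS flips only the first (L/R) axis.
transform = np.array([[0, -1.0],   # output x = -input x
                      [1,  1.0],   # output y =  input y
                      [2,  1.0]])  # output z =  input z

bvecs = np.array([[1.0, 0.0, 0.6],
                  [0.0, 1.0, 0.8],
                  [0.0, 0.0, 0.0]])

flipped = np.zeros_like(bvecs)
for out_ax, (in_ax, flip) in enumerate(transform):
    flipped[out_ax] = bvecs[int(in_ax)] * flip

print(flipped)  # first row negated, other rows unchanged
```

Only the components along flipped axes change sign, so gradient norms are preserved.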

@mattcieslak commented:
I took a similar approach for reorienting images and bvecs (https://github.com/PennBBL/qsiprep/blob/master/qsiprep/interfaces/images.py#L358). If you're planning on using ants for most of the registration, then keeping things in LPS+ makes life a lot easier. Internally their world coordinate system is oriented this way and it makes operations like local rotations much easier. The downside is that if you're using smriprep there will be a lot of internal image axis flipping.

@dPys dPys force-pushed the enh/vectors branch 2 times, most recently from 04e97e4 to ae3a0f4 Compare October 28, 2019 05:27
@dPys (Collaborator, Author) commented Oct 28, 2019

> I took a similar approach for reorienting images and bvecs [...]. If you're planning on using ants for most of the registration, then keeping things in LPS+ makes life a lot easier. [...]

@mattcieslak -- see if you like the updated version above. It now permits any of RAS/LAS/LPS as user-specified output orientations, rather than hard-coding it to any particular one. I do see what you mean about LPS being more ANTs friendly. Any ideas why the ANTs people haven't accepted RAS+ as a standard? As you mentioned, it makes things a bit trickier for us (plus maybe also risks confusion on the part of the user?) if we go with LPS and then have to keep flipping everything back...

@dPys (Collaborator, Author) commented Oct 28, 2019

@oesteban --

I've made the requested changes plus some further enhancements:

  1. Per your suggestion, I've created a class called VectorTools which now contains each of 8 different I/O and data-integrity-check methods. This should be much more concise than the original function I was relying on before.
  2. A vector reorientation tool (also a method of VectorTools) to force the dwi and bvecs into a given image orientation with mutual correspondence (I default to RAS+ given our choice of a RAS+B .tsv file, though see @mattcieslak's comment above).
  3. I've started to add a number of synthetic corrupted bvecs/bvals for testing the integrity checkers. This has already surfaced the need for a bit more error handling, which I'll get to soon.
  4. I've fleshed out the core interface (now called CleanVecs) and added some switches so we have better control over what the interface does each time we call it in workflows.
  5. I've also added a few routines of my own for dwi/bval consistency checking. Essentially, I estimate B0 locations from the image using relative mean signal variation, flagging volumes more than 3 SDs above the mean across all 4D volumes; this leverages the fact that b=0 volumes should have much higher signal than gradient-encoded volumes. I then compare these indices with those specified by the bvals (and found in the bvec rows) and ensure they correspond; if they don't, a UserWarning is raised. Similarly, I use sklearn's MeanShift to cluster the mean signal intensities across image volumes and check that the number of clusters corresponds to the number of unique shells plus B0's. This is a WIP (I'm honestly surprised I've never seen other software do this), but I'm convinced we should include something like it in this vector-checking interface. It's also essentially instantaneous (~1 s total runtime). Let me know what you think.
  6. We still need to PR the gradient table RAS+B .tsv I/O routines to nibabel, per your suggestion. And we still need to come up with a JSON reader/writer for the gradient table metadata.

@dPys
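A numpy-only sketch of the b=0 consistency idea in point 5, on synthetic data. The 1-SD cutoff and the data are illustrative; the PR uses a 3-SD criterion on relative signal variation plus sklearn's MeanShift for the shell-clustering step, both omitted here:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 4D series: 2 b=0 volumes (bright) + 8 diffusion-weighted (dim).
n_b0, n_dwi = 2, 8
vols = np.concatenate([
    rng.normal(1000, 20, size=(4, 4, 4, n_b0)),
    rng.normal(300, 20, size=(4, 4, 4, n_dwi)),
], axis=-1)
bvals = np.array([0] * n_b0 + [1000] * n_dwi)

# Mean signal per volume; b=0 volumes stand out as high-signal outliers.
mean_signal = vols.reshape(-1, vols.shape[-1]).mean(axis=0)
thresh = mean_signal.mean() + mean_signal.std()  # illustrative cutoff
b0_from_image = np.where(mean_signal > thresh)[0]
b0_from_bvals = np.where(bvals == 0)[0]

assert np.array_equal(b0_from_image, b0_from_bvals)  # indices agree
```

When the two index sets disagree, the interface described above would raise a UserWarning rather than failing hard.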

@oesteban (Member) commented:

@dPys:

> Basically, if the dwi image is in RAS, then we leave the bvec alone. If the image is in any other orientation, we reorient it to RAS and invert the corresponding axes of the bvecs that require flipping to get there. I was under the impression that pretty much all tools use image index coordinates as the convention? Is that not the case?

Why must the DWIs be in RAS+ coords? (Poking @mattcieslak on this one, as he has more experience.) I think how you access the data array is independent of the coordinates of the bvecs. If you have an interface that provides both RASB and voxel coords (after all your checks), then nothing prevents you from grabbing the output you really need down the line.

> If so, and if we do plan to impose RAS+ orientation in dmriprep in general, it should definitely happen near the beginning of the workflow, and the bvec should be corrected accordingly...

I would try to avoid reorienting images by all means. If we still want to do it (and believe me, I cry when I find a NIfTI with an orientation other than RAS+), then that should be COMPLETELY independent of this PR. To calculate RASB you just need the affine of the image.
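The point that the affine alone suffices for RAS+B can be sketched as follows. The simple column-normalization here handles rotation/flip affines without shear, and is an illustration rather than the PR's implementation:

```python
import numpy as np


def bvecs_to_ras(affine, bvecs):
    """Rotate image-coordinate (FSL-style) bvecs into RAS+ world coordinates.

    affine : (4, 4) voxel-to-RAS matrix; bvecs : (N, 3), one row per volume.
    Assumes a shear-free affine (normalizing columns drops voxel sizes only).
    """
    rotation = affine[:3, :3] / np.linalg.norm(affine[:3, :3], axis=0)
    ras = (rotation @ np.asarray(bvecs, dtype=float).T).T
    # Re-normalize nonzero rows so gradients stay unit-norm.
    norms = np.linalg.norm(ras, axis=1)
    nz = norms > 1e-8
    ras[nz] = ras[nz] / norms[nz, np.newaxis]
    return ras


# LAS image (x axis flipped, 2 mm voxels): the x component changes sign.
affine = np.diag([-2.0, 2.0, 2.0, 1.0])
bvecs = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
print(bvecs_to_ras(affine, bvecs))  # row 0 stays zero; row 1 becomes [-1, 0, 0]
```

No image data is touched: the gradient table is expressed in world coordinates using only header information, which is exactly why image reorientation can stay out of this PR.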

@mattcieslak:

> I took a similar approach for reorienting images and bvecs (https://github.com/PennBBL/qsiprep/blob/master/qsiprep/interfaces/images.py#L358). If you're planning on using ants for most of the registration, then keeping things in LPS+ makes life a lot easier. [...]

ITK/ANTs have a problem with NIfTIs that define their orientation with the s-form matrix. In fMRIPrep we run a series of checks on the NIfTI headers and make sure they all have q-form and s-form matrices that are consistent. Then you just need to be careful after any ITK-based processing, as s-forms will not be updated (and thus downstream steps will likely get the orientation wrong).
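The consistency check described here boils down to comparing two 4x4 affines. A minimal sketch with the matrices given directly as arrays (in nibabel they would come from `img.get_qform()` and `img.get_sform()`):

```python
import numpy as np


def qform_sform_consistent(qform, sform, atol=1e-4):
    """True if the two 4x4 voxel-to-world affines agree within tolerance."""
    return bool(np.allclose(np.asarray(qform), np.asarray(sform), atol=atol))


qform = np.diag([2.0, 2.0, 2.0, 1.0])
sform = qform.copy()
sform[0, 3] = 0.5                            # ITK-style drift: s-form not updated
print(qform_sform_consistent(qform, qform))  # True
print(qform_sform_consistent(qform, sform))  # False
```

A pipeline would run this after every ITK-based step and rewrite the s-form from the q-form when they drift apart.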

This is potentially going to get much much better when we finish https://github.com/poldracklab/nitransforms (which btw is coming along pretty fast).

@dPys

  • I've created a class per your suggestion called VectorTools which now contains each of 8 different methods of io and data integrity checks. This should be much more concise than the original function that I was relying on before.

I'll have a look ASAP

  • A vector reorientation tool (that is also a method of VectorTools) to force the dwi and bvecs into a given image orientation with mutual correspondence (I default to RAS+ given our choice of a rasb tsv file, though see @mattcieslak 's comment above).

Great! I'll give this all my love. I hope there are some nice tests there :D. Please grab from nitransforms whatever you need; it's going to become a dependency anyway (better said, it will be part of nibabel).

  • I've started to add a number of synthetic corrupted bvecs/bvals for testing the integrity checkers. This part already raised the need for a tad more error handling, which I'll have to get to at some point soon.

Sounds good. I'll probably have to send a PR to your branch with stuff related to the above two points, so I'll have this in mind and try to help.

  • I've fleshed out the core interface (now called CleanVecs), and added some switches for us to have better control over what the interface is doing each time we call it in workflows.

👍

  • I've also added a few routines of my own for doing dwi/bval consistency checking -- basically all that's happening here is that I'm evaluating B0 locations from the image using relative mean signal variation on the basis of 3 sd's above the mean across all 4d volumes. This essentially leverages the fact that the low B0 should always have comparatively way higher SNR than gradient-encoded volumes. I can then compare these indices with the indices specified by the bval (and found in the bvec rows) and ensure they correspond. If they don't, then all that happens is a UserWarning gets raised. Similarly, I go on to use sklearn's MeanShift to perform clustering of the mean signal intensities across image volumes to ensure that the number of clusters correspond to the number of unique shells + B0's. This is a WIP (honestly surprised that I've never seen any other software do this), but I'm convinced that we should include something like it as a part of this vector-checking interface. It's also essentially instantaneous (~1 sec runtime total). Let me know what you think.

I think this is awesome. If you want to split this to a subsequent PR, that will simplify your life to get this one in, simplify mine when reviewing and also reduce the risk we get out of focus in the PR (lengthening the review period).

  • We still need to PR the gradient table rasb .tsv file io routines to nibabel per your suggestion.

Don't miss your sleep on this one. Let's test the waters first (WDYT, @effigies?)

  • And we still need to come up with a JSON reader/writer for the gradient table metadata.

That's already there, and it's called pyBIDS :D.

@dPys dPys closed this Oct 29, 2019
@dPys dPys reopened this Oct 29, 2019
@effigies (Member) commented Oct 29, 2019

> • We still need to PR the gradient table rasb .tsv file io routines to nibabel per your suggestion.
>
> Don't miss your sleep on this one. Let's test the waters first (WDYT, @effigies?)

There are 83 comments in this thread. What's the specific proposal?

I can make a couple general comments, and if they don't fit what you're talking about, you can provide more context.

A pretty good model in the past has been to develop the IO routines you need in your own project, and once they're solidified, to add them to nibabel. For example, PySurfer did this with nibabel.freesurfer.io. This allows you to develop without being constrained by the nibabel release cycle.

Also, in case it's relevant, the data types have fairly heterogeneous APIs in nibabel. Images have one API, streamlines another, GIFTIs are barely related to NIfTIs, and the FreeSurfer IO functions work with dictionaries and numpy arrays, eschewing custom Python objects altogether. So don't feel obligated to contort yourself to mimic any of these APIs, if they're not natural. Figure out what makes sense for your data. (This also argues for developing locally and upstreaming later, as you can work through a few APIs if needed.)

@oesteban (Member) commented:

> There are 83 comments in this thread. What's the specific proposal?

Sorry it was overwhelming; you clearly found your way to the proposal (I/O and storage of bvecs and bvals for DWI).

> I can make a couple general comments, and if they don't fit what you're talking about, you can provide more context.

Thanks, they do fit :)

> A pretty good model in the past has been to develop the IO routines you need in your own project, and once they're solidified, to add them to nibabel. [...]
>
> Also, in case it's relevant, the data types have fairly heterogeneous APIs in nibabel. [...] Figure out what makes sense for your data. [...]

Great, that confirms our approach. We will first develop something that works for us, keeping the code as nibabel-ish as possible without letting that become a blocker. Then we can see if there's interest in adding it to nibabel.

oesteban and others added 5 commits November 4, 2019 08:41
I have revised the utilities and the nipype interface, trying to
simplify the code (without missing functionality).

Some assumptions that can be made on inputs:
* RASB tables will be BIDS-valid, and hence normalized, with absolute
b-vals, etc.
* bvec+bval will be in FSL format (as mandated by BIDS), and hence in image coordinates.

In general, I've tried to remove repetition of code sections, added
doctests that will serve for documentation, minimized dependencies,
checked code style.

I haven't gone through the tests, but they will need a deep revision.
I would recommend using pytest fixtures to minimize the lines of code
and automate some clerical tasks (e.g., setting up data, changing to a
temporary folder, etc.).
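The fixture suggestion in the commit message might look like this sketch (fixture and file names are hypothetical):

```python
import pytest


@pytest.fixture(autouse=True)
def in_tmp_path(tmp_path, monkeypatch):
    """Run every test from an empty temporary folder (autouse)."""
    monkeypatch.chdir(tmp_path)
    return tmp_path


@pytest.fixture(scope="session")
def dwi_data(tmp_path_factory):
    """Fetch (or synthesize) test data once per session and cache it."""
    datadir = tmp_path_factory.mktemp("data")
    (datadir / "dwi.bval").write_text("0 1000 1000\n")
    return datadir


def test_bvals_readable(dwi_data):
    bvals = (dwi_data / "dwi.bval").read_text().split()
    assert len(bvals) == 3
```

The autouse fixture removes per-test chdir boilerplate, and the session-scoped fixture makes expensive data setup (e.g., the dipy fetcher) run only once.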
@codecov-io commented Nov 4, 2019

Codecov Report

Merging #26 into master will increase coverage by 14.08%.
The diff coverage is 100%.


@@             Coverage Diff             @@
##           master      #26       +/-   ##
===========================================
+ Coverage   40.57%   54.66%   +14.08%     
===========================================
  Files           9       11        +2     
  Lines         589      772      +183     
  Branches       92      116       +24     
===========================================
+ Hits          239      422      +183     
  Misses        349      349               
  Partials        1        1
Impacted Files Coverage Δ
dmriprep/interfaces/vectors.py 100% <100%> (ø)
dmriprep/utils/vectors.py 100% <100%> (ø)

Continue to review the full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 3c5d0f1...d2bc103.

@oesteban (Member) left a review comment

I think this is ready to go on my side. Some highlights:

  • Sherbrooke 3-shell dataset is now pulled by pytest for the whole session and the cache directory can be changed.
  • Added a few fixtures to access resources easily from both doctests and tests.
  • Encapsulated b-matrix operations in one class, happy to keep working on this.
  • Pretty high coverage with tests (see https://codecov.io/gh/nipreps/dmriprep/pull/26)
  • Hopefully sufficient documentation with docstrings.

I believe next is:

  • In-depth revision from @dPys, with two goals: i) verifying that I did not leave anything out from his original implementation; ii) asking me about anything that remains unclear regarding why I made particular changes.
  • A second pass from @arokem with dipy's eyes (although I've minimized this dependency, I think it wasn't necessary for most of the implementation).
  • Everyone else is invited to have a look. Perhaps @josephmje and @mattcieslak could check on this with an eye on their currently open PRs.

@josephmje (Collaborator) commented:

Just a question from my end. Is the idea to use the output rasb tsv file instead of the input bvec and bval files for all downstream operations?

@dPys (Collaborator, Author) commented Nov 4, 2019

> Just a question from my end. Is the idea to use the output rasb tsv file instead of the input bvec and bval files for all downstream operations?

@josephmje -- I think as much as possible, yes. It is simply a more concise way of storing the vector information, so that: 1) we are not restricted to relying on two separate files; and 2) we always know the orientation of the vectors relative to RAS (information that is absent from traditional bvecs alone). Now, should we choose to include FSL-based tools like EDDY (at least until dmriprep develops better ones), we would still need to augment the interfaces to convert the RAS+B .tsv back, since those tools fundamentally rely on the bvec/bval formats -- but this shouldn't be too big a deal.
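For concreteness, a RAS+B .tsv round-trip with plain numpy, assuming one row per volume holding the R, A, S components plus the b-value (the tab-separated layout is inferred from this thread):

```python
import io

import numpy as np

# One row per volume: R, A, S components plus the b-value (RAS+B layout).
gradients = np.array([
    [0.0, 0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0, 1000.0],
    [0.0, 1.0, 0.0, 1000.0],
])

buf = io.StringIO()
np.savetxt(buf, gradients, delimiter="\t", header="\t".join("RASB"),
           fmt=["%.8f"] * 3 + ["%g"])

buf.seek(0)
roundtrip = np.loadtxt(buf)  # the '# R A S B' header is skipped as a comment
assert np.allclose(roundtrip, gradients)
```

A single file carries both direction and magnitude, which is exactly why the .tsv can replace the bvec/bval pair downstream.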

@oesteban (Member) commented Nov 4, 2019

As it is implemented right now, you will have access to both formats at all times (RAS+B and FSL-style), so you just connect the right input to the interface.

Review thread on dmriprep/utils/vectors.py (outdated; resolved)
'dwi_file': dipy_datadir / "HARDI193.nii.gz",
'bvecs': np.loadtxt(dipy_datadir / "HARDI193.bvec").T,
'bvals': np.loadtxt(dipy_datadir / "HARDI193.bval"),
}


@pytest.fixture(autouse=True)
@dPys (Collaborator, Author) commented on the diff:
This is awesome.

np.savetxt(str(path), self.gradients,
delimiter='\t', header='\t'.join('RASB'),
fmt=['%.8f'] * 3 + ['%g'])
if bvecs:
@dPys (Collaborator, Author) commented on the diff:

Do we need to include a conditional here, or does the shape get enforced by the interface? Hard to tell.

            if bvecs.shape[-1] == 3:
                np.savetxt(str(bvecs), self.bvecs.T, fmt='%.6f')
            else:
                np.savetxt(str(bvecs), self.bvecs, fmt='%.6f')

@oesteban (Member):

At this point, self.bvecs has (Nx3) dimensions. Please note that bvecs here just contains a string or a Path (perhaps improving the docstring here would be nice).

@dPys (Collaborator, Author):

Oh, I see that now. I wonder if, to avoid confusion, we should rename bvecs to fbvecs everywhere we're referring to a file string/Path, and use bvecs any time we're referring to an array?

@oesteban (Member):

The function is called to_filename; I don't think changing the name of the arguments will change much. I guess it is more of an interface problem, i.e., should to_filename store only RASB, and then we add two more methods like bvecs_to_filename and bvals_to_filename, or a write_bvec_bval(path) that adds .bval and .bvec?

Another alternative is to_filename(filename, filetype='rasb'), switching to bvec/bval mode if filetype is 'fsl'. I would probably like this last one better.
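The filetype-switch alternative could be sketched like this; the DiffusionGradientTable stand-in and the exact column handling are assumptions for illustration, not the merged implementation:

```python
from pathlib import Path

import numpy as np


class DiffusionGradientTable:
    """Minimal stand-in holding an (N, 4) RAS+B gradients array."""

    def __init__(self, gradients):
        self.gradients = np.asarray(gradients, dtype=float)

    def to_filename(self, filename, filetype="rasb"):
        if filetype == "rasb":
            np.savetxt(str(filename), self.gradients, delimiter="\t",
                       header="\t".join("RASB"), fmt=["%.8f"] * 3 + ["%g"])
        elif filetype == "fsl":
            # FSL convention: 3xN bvec file plus a one-row bval file.
            base = Path(filename)
            np.savetxt(base.with_suffix(".bvec"),
                       self.gradients[:, :3].T, fmt="%.6f")
            np.savetxt(base.with_suffix(".bval"),
                       self.gradients[:, 3][np.newaxis, :], fmt="%g")
        else:
            raise ValueError(f"Unknown filetype: {filetype}")
```

Here `to_filename(path)` writes the RAS+B table, while `to_filename(path, filetype='fsl')` writes sibling .bvec/.bval files, so callers never juggle two formats explicitly.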

@dPys (Collaborator, Author):

Fair enough. I like the latter alternative best, but yes, I think we need something like a write_bvec_bval(path) here. I sort of had something like this previously in the interface with:

if self.inputs.save_fsl_style is True:
    fbval_out_file, fbvec_out_file = vt.save_vecs_fsl()

but the to_filename approach would be best

@oesteban (Member):

Implemented the to_filename(filename, filetype='fsl') route. 👍

@dPys (Collaborator, Author) left a review comment

Only one point noted. All else looks amazing. Nice teamwork!

dmriprep/utils/vectors.py Outdated Show resolved Hide resolved
@oesteban (Member) commented Nov 5, 2019

Okay, I don't think it'd be premature to merge. Let's do this!

@oesteban oesteban merged commit a5122f7 into nipreps:master Nov 5, 2019
@oesteban (Member) commented Nov 5, 2019

@dPys, please send a PR adding your name to the zenodo file. Please remember to include [skip ci] in the commit title (the first line)

Labels: enhancement (New feature or request)
9 participants