
MDMR issue with bigmemory_CCleanindices #9

Open
maxwellelliott opened this issue Jun 7, 2019 · 4 comments
maxwellelliott commented Jun 7, 2019

Below is my call to connectir_mdmr.R and its output. I have tried to track down the error, and it appears to be an issue with CCleanIndices in bigextensions/R/utils.R. I think bigmemory has changed something since this fix from two years ago (czarrar/bigextensions@14a11f8). I tried to fix it myself but couldn't figure it out.

/dcc/Scripts/Tools/CWAS/connectir_mdmr.R -i /dcc/Projects/Max/DBIS/pFactor/CWAS/subdist.n746_060719 --formula 'sex + p' --model /dcc/Projects/Max/DBIS/pFactor/lists/model_746_wFunc_p38_060719.csv --factors2perm 'p' --memlimit 5 -c 1 --save-perms -p 100 --ignoreprocerror --skip-subdist-check --debug mdmr.n746_060719

Loading niftir version < 1.0.

Setting 1 parallel forks
Setting 1 threads for matrix algebra operations
Checking options
Setting up inputs
...data dimensions
...formula
...factors to permute
...factors2perm
...model
...creating output directory '/dcc/Projects/Max/DBIS/pFactor/CWAS/subdist.n746_060719/mdmr.n746_060719'
Reading in subject distances
...checking input
Determining MDMR memory demands
...1.1 MB used for permutation indices
...0.0 MB used for p-values
...minimum of 17.0 MB used for distance matrices
...minimum of 0.0 MB used for permuted pseudo-F stats
...minimum of 8.5 MB used for permuted model matrices
...minimum of 8.5 MB used for permuted error matrices
...minimum of 0.0 MB used for temporary matrices
...memory limit is 5.00 GB and a minimum of 0.043 GB is needed
...setting super-block size to 376 (out of 376 voxels)
...setting block size to 101 (out of 101 permutations)
...3.98 GB of RAM will be used
Generating hat matrices and like (with mdmr_model)
...creating right-hand design matrix
...calculating QR decomposition
p
2
[1] 1
...checking/adjusting for rank deficiencies
...creating the hat matrices (H2s and IHs)
...calculating degrees of freedom
...combining elements to return
Preparing data
...preparing permutation matrices
Gathering permuted observation indices
...for factor p (#1)
|======================================================================| 100%
...preparing file-backed pesudo-F matrices
Computing MDMR across 1 large blocks
...with 1 smaller blocks within each larger one
large block 1
...grabbing subset of Gower matrices
...preparing partial pesudo-F matrices
...looping through 1 permutation block(s)
| | 0%...preparing subset of permutation indices
Generating permuted hat matrices (H2s)
...for factor p (#1)
Generating permuted hat matrices (IHs)
...for factor p (#1)
...calculating Pseudo-F stats
...removing permutation indices
Error in .Call("bigmemory_CCleanIndices", as.double(rows), as.double(nr), :
"bigmemory_CCleanIndices" not available for .Call() for package "bigmemory"

An error was detected:
with piece 1:
[1] 1

Called by:
NULL

Saving options...

Removing everything from memory
...sucesss
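
For anyone hitting this before a fix lands: the failing call is `.Call("bigmemory_CCleanIndices", ...)`, which stops working once a bigmemory build no longer registers that C routine for other packages to call. One workaround would be to replace the call in bigextensions/R/utils.R with a pure-R stand-in. This is only a sketch under assumptions: the routine's behavior (validate row indices against the matrix extent and report whether they are sorted and unique) is inferred from its name and call signature alone, and `clean_indices` is a hypothetical helper name.

```r
# Hypothetical pure-R stand-in for the removed C routine; the real
# semantics of bigmemory_CCleanIndices are assumed from its name and
# call signature (row indices plus total row count), not confirmed.
clean_indices <- function(idx, n) {
  # keep only whole-number indices inside [1, n]
  idx <- idx[idx >= 1 & idx <= n & idx == floor(idx)]
  list(indices = idx,
       # TRUE when the surviving indices are strictly increasing,
       # i.e. already sorted and free of duplicates
       sorted_unique = !is.unsorted(idx, strictly = TRUE))
}
```

If the C routine actually did more than bounds-checking (e.g. reordering), this stand-in would need adjusting to match.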

czarrar (Owner) commented Jun 13, 2019

I do not actively maintain this package anymore. Did you try installing an older version of bigmemory, such as 4.4.6?
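
A sketch of how to pin that older release (assuming the `remotes` package is available; the exact CRAN mirror URL is illustrative):

```r
# Pin bigmemory to an older release from the CRAN archive; requires the
# 'remotes' package (install.packages("remotes") if it is missing).
remotes::install_version("bigmemory", version = "4.4.6",
                         repos = "https://cloud.r-project.org")
```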

Also did you use voxelwise or parcels? If you used parcels, I could recommend a native R version of MDMR.

maxwellelliott (Author) commented

Got it. Thanks for the advice.

I reinstalled an older version of bigmemory and the command worked again!

dseok commented Oct 5, 2019

Which version of bigmemory did you end up installing? I tried 4.4.6 and 4.4.5 and I'm still getting the same error you are.
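
One way to narrow this down (a diagnostic sketch, not from the original thread): check whether the bigmemory build you installed actually registers the C symbol the error names. Both calls are base R.

```r
# Ask the installed bigmemory build whether it still registers the C
# routine the error message names; both checks are base R.
library(bigmemory)
is.loaded("bigmemory_CCleanIndices", PACKAGE = "bigmemory")
getDLLRegisteredRoutines("bigmemory")$.Call
```

If `is.loaded` returns FALSE, the installed version (whatever its number) does not expose the routine, and downgrading further or patching bigextensions would be the next step.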

maxwellelliott (Author) commented Oct 9, 2019 via email
