Error in Participant-Level Processing with MRtrix3 Connectome Pipeline: Unable to Import Requisite Pre-Processed Data #128
The issue, I believe, is that you are attempting to provide what was generated by the preproc analysis level. (While I use the "muh BIDS App spec" excuse here, technically the BIDS App spec says that the analysis level should be either "participant" or "group".)
Are you taking the "minimally preprocessed" HCP data and then running it through the preproc level?
There's a chance this is another slightly misleading error message.
Thank you for your previous guidance. Following your suggestions, I have made the following adjustments and retried running the MRtrix3 Connectome pipeline:
Actions Taken:
Results:
Questions:
I would appreciate any advice on how to resolve this issue. Thank you for your assistance.
I've no idea how well this tool will go executing on the HCP-YA unprocessed data. Some years back I had a student attempting to replicate the HCP-YA preprocessing, and we were unable to replicate data of the same quality as their provided minimally processed data, even though we were using their containerised versions of the software. Gradient non-linearity geometric distortions are also likely to cause far greater problems on the Connectom scanner, but that correction is currently absent from this pipeline.
Thank you for your advice. After removing the scratch directory that appeared to be left over from previous runs, the issue with FreeSurfer not finding the scratch directory was resolved. Should I avoid using this pipeline on raw HCP data?
I would suggest using the minimally processed HCP data unless there's a strong justification otherwise. While there's the potential for improvements in data quality compared to the original processing (e.g. with PCA denoising or within-volume motion correction), in the absence of a more robust study comparing them there's no guarantee that re-processed data won't be of net poorer quality. As I mentioned, we tried this at one point and were never content with the results we got, though we didn't do an exhaustive evaluation of preprocessing parameters. There was a promise of an HCP data release with an alternative preprocessing pipeline, with an emphasis on improved susceptibility field estimation, but I've not been able to find anything about a data release: https://cds.ismrm.org/protected/22MProceedings/PDFfiles/0425.html.
I understand now why it is better to use the minimally preprocessed data. I want to create a structural connectivity matrix using the HCP MMP1.0 atlas, but I am unsure how to do this from the minimally preprocessed HCP data. I have been looking for a method but haven't found one yet. Therefore, I am attempting to use this pipeline to process the raw data. Any advice you can provide would be greatly appreciated.
You could potentially trick this tool into utilising the minimally preprocessed data:
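A minimal sketch of what that could look like, assuming (as discussed later in this thread) that the participant level imports a derivative named MRtrix3_connectome-preproc from the output directory. Every path, subject ID, and filename below is an assumption, not the tool's documented behaviour:

```bash
# Hypothetical repackaging of HCP minimally preprocessed data as the
# derivative that the participant level would expect to import.
# Subject ID, HCP paths, and output filenames are all assumptions.
SUB=100307
HCP=/data/HCP/${SUB}/T1w            # HCP "minimally preprocessed" T1w space
DERIV=/data/output/MRtrix3_connectome-preproc/sub-${SUB}

mkdir -p "${DERIV}/dwi" "${DERIV}/anat"

# DWI series and gradient table: HCP stores these under T1w/Diffusion/
cp "${HCP}/Diffusion/data.nii.gz" "${DERIV}/dwi/sub-${SUB}_desc-preproc_dwi.nii.gz"
cp "${HCP}/Diffusion/bvals"       "${DERIV}/dwi/sub-${SUB}_desc-preproc_dwi.bval"
cp "${HCP}/Diffusion/bvecs"       "${DERIV}/dwi/sub-${SUB}_desc-preproc_dwi.bvec"

# T1-weighted image, already co-registered to the DWI in this HCP space
cp "${HCP}/T1w_acpc_dc_restore.nii.gz" "${DERIV}/anat/sub-${SUB}_desc-preproc_T1w.nii.gz"

# BIDS derivatives require a dataset_description.json at the derivative root
cat > "$(dirname "${DERIV}")/dataset_description.json" <<'EOF'
{
  "Name": "MRtrix3_connectome-preproc",
  "BIDSVersion": "1.4.0",
  "DatasetType": "derivative",
  "GeneratedBy": [{"Name": "MRtrix3_connectome"}]
}
EOF
```

Running the preproc level once on a single throwaway subject and mirroring its exact output naming would remove most of the guesswork in this sketch.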
You could look at this repo created by a colleague and see if it is of use.
Thank you for your response. Regarding your suggestion to convert the "minimally preprocessed" data into a BIDS derivative format and label it as "MRtrix3_connectome-preproc," then execute it at the "participant" level: would this ensure correct processing? I intend to use the HCP MMP1.0 atlas, and I have observed that when using this atlas, FreeSurfer's "recon-all" is executed within the "participant" level processing. As far as I understand, "recon-all" is already applied within the HCP pipeline to the "minimally preprocessed" data. Is it acceptable for this process to be applied again?
Hi @Lestropie
Description:
I encountered an error during the participant-level processing step using the MRtrix3 Connectome pipeline. The error suggests that the pipeline is unable to import the requisite pre-processed data. Below are the details of my setup and the specific error messages encountered.
Steps to Reproduce:
Environment:
Preprocessing Details:
The following command was used for the "preproc" level processing:
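The command itself was not preserved in this report. For orientation, a representative invocation of the containerised BIDS App might look like the following; the image tag, mount points, and subject label are placeholders, not the reporter's actual call:

```bash
# Placeholder invocation of the MRtrix3_connectome BIDS App (preproc level);
# all paths and the participant label are assumptions for illustration only.
docker run -i --rm \
    -v /data/bids:/bids_dataset \
    -v /data/output:/output \
    bids/mrtrix3_connectome \
    /bids_dataset /output preproc \
    --participant_label 100307
```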
BIDS Conversion:
Directory Structure and Relevant Files:
The directory structure of the preprocessed data is as follows:
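The listing itself is missing here. Assuming the preproc level writes a BIDS derivative named MRtrix3_connectome-preproc (an assumption to verify against an actual run), the structure would presumably resemble something like:

```
output/
└── MRtrix3_connectome-preproc/
    ├── dataset_description.json
    └── sub-100307/
        ├── anat/
        │   └── sub-100307_desc-preproc_T1w.nii.gz
        └── dwi/
            ├── sub-100307_desc-preproc_dwi.nii.gz
            ├── sub-100307_desc-preproc_dwi.bval
            └── sub-100307_desc-preproc_dwi.bvec
```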
Additional Information:
Attached Image:
The attached image shows the alignment of the T1-weighted image and the DWI image:
Relevant Output Logs:
The following log excerpts capture the steps leading up to the error:
Questions:
Thank you for your assistance.