neurodatascience / dFC

An implementation of several well-known dynamic Functional Connectivity assessment methods.

demo does not work

Remi-Gau opened this issue

Had some issues when downloading the data:

Had to add double quotes around the URLs and output file names to make it run, probably because I am using zsh as my shell (zsh otherwise tries to glob-expand the ? in the versionId query string).

Note that this may make it hard for Windows users to run the demo, because I think curl quoting works differently on Windows.

!curl --create-dirs "https://s3.amazonaws.com/openneuro.org/ds002785/derivatives/fmriprep/sub-0001/func/sub-0001_task-restingstate_acq-mb3_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz?versionId=UfCs4xtwIEPDgmb32qFbtMokl_jxLUKr" -o "sample_data/sub-0001_task-restingstate_acq-mb3_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz"
!curl --create-dirs "https://s3.amazonaws.com/openneuro.org/ds002785/derivatives/fmriprep/sub-0001/func/sub-0001_task-restingstate_acq-mb3_desc-confounds_regressors.tsv?versionId=biaIJGNQ22P1l1xEsajVzUW6cnu1_8lD" -o "sample_data/sub-0001_task-restingstate_acq-mb3_desc-confounds_regressors.tsv"
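A shell-agnostic alternative would be to do the download from Python instead of curl, so the URLs never hit shell quoting at all. A minimal sketch using only the standard library (same URLs and output paths as the curl commands above):

# Download the sample data with Python's standard library instead of curl,
# so the query strings in the URLs need no shell-specific quoting.
# Same URLs and output paths as the curl commands above.
import os
import urllib.request

files = {
    "sample_data/sub-0001_task-restingstate_acq-mb3_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz":
        "https://s3.amazonaws.com/openneuro.org/ds002785/derivatives/fmriprep/sub-0001/func/sub-0001_task-restingstate_acq-mb3_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz?versionId=UfCs4xtwIEPDgmb32qFbtMokl_jxLUKr",
    "sample_data/sub-0001_task-restingstate_acq-mb3_desc-confounds_regressors.tsv":
        "https://s3.amazonaws.com/openneuro.org/ds002785/derivatives/fmriprep/sub-0001/func/sub-0001_task-restingstate_acq-mb3_desc-confounds_regressors.tsv?versionId=biaIJGNQ22P1l1xEsajVzUW6cnu1_8lD",
}

os.makedirs("sample_data", exist_ok=True)  # replaces curl's --create-dirs
for out_path, url in files.items():
    urllib.request.urlretrieve(url, out_path)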

I am then getting an error at this step:

# load sub-0001 data from nifti file
BOLD = data_loader.nifti2timeseries(
            nifti_file='sample_data/sub-0001_task-restingstate_acq-mb3_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', 
            n_rois=100, Fs=1/0.75,
            subj_id='sub-0001',
            confound_strategy='no_motion', # no_motion, no_motion_no_gsr, or none
            standardize=False,
            TS_name=None,
            session=None,
        )

BOLD.visualize( start_time=0, end_time=1000, nodes_lst=range(10))
{
	"name": "IndexError",
	"message": "only integers, slices (`:`), ellipsis (`...`), numpy.newaxis (`None`) and integer or boolean arrays are valid indices",
	"stack": "---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
Cell In[6], line 2
      1 # load sub-0001 data from nifti file
----> 2 BOLD = data_loader.nifti2timeseries(
      3             nifti_file='sample_data/sub-0001_task-restingstate_acq-mb3_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', 
      4             n_rois=100, Fs=1/0.75,
      5             subj_id='sub-0001',
      6             confound_strategy='no_motion', # no_motion, no_motion_no_gsr, or none
      7             standardize=False,
      8             TS_name=None,
      9             session=None,
     10         )
     12 BOLD.visualize( start_time=0, end_time=1000, nodes_lst=range(10))

File ~/github/origami/dFC/pydfc/data_loader.py:243, in nifti2timeseries(nifti_file, n_rois, Fs, subj_id, confound_strategy, standardize, TS_name, session)
    225 def nifti2timeseries(
    226         nifti_file, 
    227         n_rois, Fs,
   (...)
    232         session=None,
    233     ):
    234     '''
    235     this function is only for single subject and single session data loading 
    236     it uses nilearn maskers to extract ROI signals from nifti files
   (...)
    241     {100, 200, 300, 400, 500, 600, 700, 800, 900, 1000}
    242     '''
--> 243     time_series, labels, locs = nifti2array(
    244         nifti_file=nifti_file, 
    245         confound_strategy=confound_strategy, 
    246         standardize=standardize,
    247         n_rois=n_rois
    248     )
    250     assert type(locs) is np.ndarray, 'locs must be a numpy array'
    251     assert type(labels) is list, 'labels must be a list'

File ~/github/origami/dFC/pydfc/data_loader.py:205, in nifti2array(nifti_file, confound_strategy, standardize, n_rois)
    199 elif confound_strategy=='no_motion':
    200     confounds_simple, sample_mask = load_confounds(
    201         nifti_file,
    202         strategy=[\"high_pass\", \"motion\", \"wm_csf\"],
    203         motion=\"basic\", wm_csf=\"basic\"
    204     )
--> 205     time_series = masker.fit_transform(
    206         nifti_file,
    207         confounds=confounds_simple,
    208         sample_mask=sample_mask
    209     )
    210 elif confound_strategy=='no_motion_no_gsr':
    211     confounds_simple, sample_mask = load_confounds(
    212         nifti_file,
    213         strategy=[\"high_pass\", \"motion\", \"wm_csf\", \"global_signal\"],
    214         motion=\"basic\", wm_csf=\"basic\", global_signal=\"basic\"
    215     )

File ~/miniconda3/lib/python3.11/site-packages/sklearn/utils/_set_output.py:140, in _wrap_method_output.<locals>.wrapped(self, X, *args, **kwargs)
    138 @wraps(f)
    139 def wrapped(self, X, *args, **kwargs):
--> 140     data_to_wrap = f(self, X, *args, **kwargs)
    141     if isinstance(data_to_wrap, tuple):
    142         # only wrap the first output for cross decomposition
    143         return (
    144             _wrap_data_with_container(method, data_to_wrap[0], X, self),
    145             *data_to_wrap[1:],
    146         )

File ~/github/nilearn/nilearn/nilearn/maskers/nifti_labels_masker.py:531, in NiftiLabelsMasker.fit_transform(self, imgs, confounds, sample_mask)
    500 def fit_transform(self, imgs, confounds=None, sample_mask=None):
    501     \"\"\"Prepare and perform signal extraction from regions.
    502 
    503     Parameters
   (...)
    529 
    530     \"\"\"
--> 531     return self.fit(imgs).transform(
    532         imgs, confounds=confounds, sample_mask=sample_mask
    533     )

File ~/miniconda3/lib/python3.11/site-packages/sklearn/utils/_set_output.py:140, in _wrap_method_output.<locals>.wrapped(self, X, *args, **kwargs)
    138 @wraps(f)
    139 def wrapped(self, X, *args, **kwargs):
--> 140     data_to_wrap = f(self, X, *args, **kwargs)
    141     if isinstance(data_to_wrap, tuple):
    142         # only wrap the first output for cross decomposition
    143         return (
    144             _wrap_data_with_container(method, data_to_wrap[0], X, self),
    145             *data_to_wrap[1:],
    146         )

File ~/github/nilearn/nilearn/nilearn/maskers/base_masker.py:267, in BaseMasker.transform(self, imgs, confounds, sample_mask)
    264     else:
    265         all_confounds.append(confounds)
--> 267 return self.transform_single_imgs(
    268     imgs, confounds=all_confounds, sample_mask=sample_mask
    269 )

File ~/github/nilearn/nilearn/nilearn/maskers/nifti_labels_masker.py:703, in NiftiLabelsMasker.transform_single_imgs(self, imgs, confounds, sample_mask)
    700     region_ids[i] = ids[i]
    702 if self.labels is not None:
--> 703     self.region_names_ = {
    704         key: self.labels[region_id]
    705         for key, region_id in region_ids.items()
    706         if region_id != self.background_label
    707     }
    708 else:
    709     self.region_names_ = None

File ~/github/nilearn/nilearn/nilearn/maskers/nifti_labels_masker.py:704, in <dictcomp>(.0)
    700     region_ids[i] = ids[i]
    702 if self.labels is not None:
    703     self.region_names_ = {
--> 704         key: self.labels[region_id]
    705         for key, region_id in region_ids.items()
    706         if region_id != self.background_label
    707     }
    708 else:
    709     self.region_names_ = None

IndexError: only integers, slices (`:`), ellipsis (`...`), numpy.newaxis (`None`) and integer or boolean arrays are valid indices"
}
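For what it's worth, the traceback shows nilearn being imported from a local checkout (~/github/nilearn), so this may be a nilearn-version issue. A quick, hypothetical debugging snippet (not part of the demo) to check which versions are actually picked up:

# Hypothetical debugging snippet: print the package versions actually in use.
import nilearn
import sklearn

print("nilearn:", nilearn.__version__, "from", nilearn.__file__)
print("scikit-learn:", sklearn.__version__)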

@mtorabi59: did you have a look at @Remi-Gau's issue? I think this needs to be fixed before the paper is released.

@Remi-Gau @jbpoline I solved the second issue by pinning the package versions in setup.py. The first issue can be solved by users, I guess.
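For reference, a sketch of what that pinning could look like in setup.py; the package names and version bounds below are illustrative assumptions, not the exact pins committed to the repo:

# Illustrative sketch of pinning dependencies in setup.py.
# The packages and version bounds are assumptions; check the repo for the real pins.
from setuptools import setup, find_packages

setup(
    name="pydfc",
    packages=find_packages(),
    install_requires=[
        "numpy",
        "nilearn>=0.10.0,<0.11",   # assumed bound; pin to a release known to work with the demo
        "scikit-learn>=1.3,<1.4",  # assumed bound
    ],
)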

Which issue can be solved by users? The curl download on Windows?

@jbpoline yes; if I change it (add double quotation marks) then it won't work on Mac. So I think Windows users can adjust the curl syntax themselves and run it for now; I will address this properly in the future.

My suggestion would be to rely on either the AWS CLI or DataLad, as suggested on the OpenNeuro download page: https://openneuro.org/datasets/ds002785/versions/1.0.0/download

That way you can be sure you don't run into an OS-dependent issue.
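For example, something along these lines with the DataLad Python API could replace the curl cell entirely; the GitHub mirror URL and the derivatives path are assumptions to double-check against the OpenNeuro download page:

# Sketch: fetch the same files with DataLad's Python API, which behaves the same on every OS.
# The OpenNeuroDatasets GitHub mirror URL and the derivatives path are assumptions to verify.
import datalad.api as dl

ds = dl.install(
    path="ds002785",
    source="https://github.com/OpenNeuroDatasets/ds002785.git",
)
# only fetch the files the demo actually needs
ds.get("derivatives/fmriprep/sub-0001/func/")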