dpeerlab / Palantir

Single cell trajectory detection

Home Page:https://palantir.readthedocs.io

Error "at least one array or dtype is required" when running run_palantir

emanuelavilla opened this issue · comments

Hello!

I am trying to run the command `run_palantir`, but I get an error. I am starting from a Seurat object that I previously converted into AnnData. Then I run:

```python
import pandas as pd
import palantir

# Run diffusion maps
pca_projections = pd.DataFrame(ad.obsm['X_pca'], index=ad.obs_names)
dm_res = palantir.utils.run_diffusion_maps(pca_projections, n_components=5)

ms_data = palantir.utils.determine_multiscale_space(dm_res)

# Not in the vignette because it is an issue to be fixed. This is a temporary solution
from scipy.sparse import csr_matrix
ad.X = csr_matrix(ad.X)

# MAGIC imputation
ad.layers['MAGIC_imputed_data'] = palantir.utils.run_magic_imputation(ad, dm_res)

start_cell = 'CCACCTATCTACGAGT_2'  # the cell with the highest expression of Cd34
pr_res = palantir.core.run_palantir(ms_data, start_cell, num_waypoints=500)
```

The resulting error is:

```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In [22], line 2
      1 start_cell = 'CCACCTATCTACGAGT_2'
----> 2 pr_res = palantir.core.run_palantir(ms_data, start_cell, num_waypoints=500)

File ~/miniconda3/envs/Palantir_traj/lib/python3.10/site-packages/palantir/core.py:58, in run_palantir(ms_data, early_cell, terminal_states, knn, num_waypoints, n_jobs, scale_components, use_early_cell_as_start, max_iterations)
     42 """Function for max min sampling of waypoints
     43
     44 :param ms_data: Multiscale space diffusion components
   (...)
     53 :return: PResults object with pseudotime, entropy, branch probabilities and waypoints
     54 """
     56 if scale_components:
     57     data = pd.DataFrame(
---> 58         preprocessing.minmax_scale(ms_data),
     59         index=ms_data.index,
     60         columns=ms_data.columns,
     61     )
     62 else:
     63     data = copy.copy(ms_data)

File ~/miniconda3/envs/Palantir_traj/lib/python3.10/site-packages/sklearn/preprocessing/_data.py:615, in minmax_scale(X, feature_range, axis, copy)
    541 """Transform features by scaling each feature to a given range.
    542
    543 This estimator scales and translates each feature individually such
   (...)
    611 <sphx_glr_auto_examples_preprocessing_plot_all_scaling.py>`.
    612 """
    613 # Unlike the scaler object, this function allows 1d input.
    614 # If copy is required, it will be done inside the scaler object.
--> 615 X = check_array(
    616     X, copy=False, ensure_2d=False, dtype=FLOAT_DTYPES, force_all_finite="allow-nan"
    617 )
    618 original_ndim = X.ndim
    620 if original_ndim == 1:

File ~/miniconda3/envs/Palantir_traj/lib/python3.10/site-packages/sklearn/utils/validation.py:768, in check_array(array, accept_sparse, accept_large_sparse, dtype, order, copy, force_all_finite, ensure_2d, allow_nd, ensure_min_samples, ensure_min_features, estimator, input_name)
    764 pandas_requires_conversion = any(
    765     _pandas_dtype_needs_early_conversion(i) for i in dtypes_orig
    766 )
    767 if all(isinstance(dtype_iter, np.dtype) for dtype_iter in dtypes_orig):
--> 768     dtype_orig = np.result_type(*dtypes_orig)
    770 if dtype_numeric:
    771     if dtype_orig is not None and dtype_orig.kind == "O":
    772         # if input is object, convert to float.

File <__array_function__ internals>:180, in result_type(*args, **kwargs)

ValueError: at least one array or dtype is required
```

Do you have any suggestions as to why the command fails?

Emanuela

Can you please check the dimensionality of `ms_data`?
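For anyone hitting this, the error is reproducible without Palantir at all: if `determine_multiscale_space` keeps zero components, `ms_data` is a DataFrame with zero columns, and sklearn's `check_array` ends up calling `np.result_type()` with an empty dtype list. A minimal sketch (the `ms_data` here is a hypothetical stand-in, not real Palantir output):

```python
import numpy as np
import pandas as pd

# Stand-in for a multiscale space with zero retained components:
# five cells, but no diffusion-component columns.
ms_data = pd.DataFrame(index=[f"cell_{i}" for i in range(5)])
print(ms_data.shape)  # (5, 0)

# sklearn's check_array eventually does np.result_type(*dtypes_orig);
# with zero columns the dtype list is empty, which raises the exact
# "at least one array or dtype is required" ValueError from the traceback.
try:
    np.result_type(*list(ms_data.dtypes))
except ValueError as e:
    print(e)
```

So checking `ms_data.shape` is the fastest way to confirm whether this is the problem.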

Hi all,

I have encountered the same issue: `ms_data` had zero columns. Increasing `n_components` when running `palantir.utils.run_diffusion_maps` solved the problem. Hope this helps!

Kind regards,
Yuyao
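Building on Yuyao's fix, a defensive check can fail early with a clearer message than the sklearn traceback. This `check_multiscale_space` helper is hypothetical (not part of the Palantir API), just a sketch of the guard:

```python
import pandas as pd

def check_multiscale_space(ms_data: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical guard: raise a readable error if determine_multiscale_space
    returned a DataFrame with zero columns (no retained diffusion components)."""
    if ms_data.shape[1] == 0:
        raise ValueError(
            "ms_data has zero columns; rerun run_diffusion_maps with a "
            "larger n_components before calling run_palantir"
        )
    return ms_data
```

Called right after `determine_multiscale_space`, it turns the opaque "at least one array or dtype is required" into an actionable message.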

I ran into this error too. How does it happen?