File "pandas/_libs/parsers.pyx", line 1973, in pandas._libs.parsers.raise_parser_error pandas.errors.ParserError: Error tokenizing data. C error: EOF inside string starting at row 37
Nathanielhubert opened this issue
Any idea why I am getting this error?
File "pandas/_libs/parsers.pyx", line 1973, in pandas._libs.parsers.raise_parser_error
pandas.errors.ParserError: Error tokenizing data. C error: EOF inside string starting at row 37
I tried the commands below; all resulted in the same error. In the third attempt, "CNT_n1000_noCTRL_comparison4_nonwater_oldPre_newPost_f1600_s11_abbv.txt" is abbreviated to 36 rows of taxa, to rule out a syntax problem in row 37.
qiime picrust2 full-pipeline \
  --i-table CNT_n1000_noCTRL_comparison4_nonwater_oldPre_newPost.qza \
  --i-seq NODE-REPRESENTATIVES_format_noUnk_noChloroplast_noSize_wTaxName.qza \
  --output-dir CNT_n1000_noCTRL_comparison4_nonwater_oldPre_newPost_PICRUSt \
  --verbose
picrust2_pipeline.py -s NODE-REPRESENTATIVES_format_noUnk_noChloroplast_noSize_wTaxName.fasta -i CNT_n1000_noCTRL_comparison4_nonwater_oldPre_newPost_f1600_s11.txt -o CNT_n1000_noCTRL_comparison4_nonwater_oldPre_newPost_f1600_s11_picrust2_2 -p 1
picrust2_pipeline.py -s NODE-REPRESENTATIVES_format_noUnk_noChloroplast_noSize_wTaxName.fasta -i CNT_n1000_noCTRL_comparison4_nonwater_oldPre_newPost_f1600_s11_abbv.txt -o CNT_n1000_noCTRL_comparison4_nonwater_oldPre_newPost_f1600_s11_picrust_abbv -p 1
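(Not from the original thread, but possibly useful for anyone debugging the same ParserError: the "EOF inside string" message usually means a line contains a NUL byte or an unclosed double quote. A minimal sketch for scanning a tab-separated table for those two culprits; the function name is mine, and you would pass your own file path.)

```python
def scan_for_parser_traps(path):
    """Return line numbers that contain NUL bytes or an odd number of
    double quotes -- the usual causes of pandas' "EOF inside string" error."""
    suspicious = []
    with open(path, "rb") as fh:  # read bytes so hidden characters are visible
        for lineno, raw in enumerate(fh, start=1):
            if b"\x00" in raw or raw.count(b'"') % 2 == 1:
                suspicious.append(lineno)
    return suspicious
```

Running this on the input table should point at the offending row (row 37 in the traceback above, if the quote/NUL theory holds).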
Any help would be greatly appreciated. Thank you, Nate
I have attached all files here. Thanks again
Archive.zip
Hi there,
I think there must be some sort of hidden character (or the 'x000...' strings) in your taxa IDs that causes problems when they are used as tip labels in the output tree, and that is what triggers this error. I renamed your taxa to simple strings like a, b, c, etc., and the command ran successfully. So I would try simplifying the IDs (and re-writing them out to strip any hidden special characters) to address this problem.
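For anyone hitting the same issue, the renaming/cleanup step can be automated. A minimal sketch, under two assumptions that are mine rather than from the thread: feature IDs sit in the first tab-separated column, and "hidden characters" means anything outside printable ASCII.

```python
import string

# Characters considered safe in a feature ID: printable ASCII minus the
# whitespace/control characters that confuse parsers and tree tip labels.
_SAFE = set(string.printable) - set("\t\n\r\x0b\x0c")

def sanitize_id(raw):
    """Drop hidden/control characters (NULs, non-ASCII bytes, etc.) from an ID."""
    return "".join(ch for ch in raw if ch in _SAFE)

def sanitize_table(in_path, out_path):
    """Rewrite a tab-separated table with a cleaned ID in the first column."""
    with open(in_path, "r", errors="replace") as fin, open(out_path, "w") as fout:
        for line in fin:
            fields = line.rstrip("\n").split("\t")
            fields[0] = sanitize_id(fields[0])
            fout.write("\t".join(fields) + "\n")
```

Cleaning the table this way (before running picrust2_pipeline.py on it) should have the same effect as Gavin's manual rename without losing the original taxon names entirely.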
Cheers,
Gavin
Thank you, Gavin! Really appreciate your help here. These files worked fine for other R and QIIME modules, so I was really lost. Thank you so much! Nate