dataset.upload fails on s3. Failed uploading: 'LazyEvalWrapper'.
nkgrush opened this issue
Describe the bug
dataset.upload fails on a medium-size .csv to S3. It works fine with a small test .txt, and the S3 bucket itself is fine. The error message is also a bit cryptic:
Uploading dataset changes (1 files compressed to 173.31 MiB) to s3://server/bucket
2023-07-23 16:49:54,715 - clearml.storage - ERROR - Failed uploading: 'LazyEvalWrapper' object cannot be interpreted as an integer
2023-07-23 16:49:55,201 - clearml.storage - ERROR - Failed uploading: 'LazyEvalWrapper' object cannot be interpreted as an integer
2023-07-23 16:49:55,738 - clearml.storage - ERROR - Failed uploading: 'LazyEvalWrapper' object cannot be interpreted as an integer
2023-07-23 16:49:55,766 - clearml.storage - ERROR - Exception encountered while uploading Upload failed
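For context on why the message is worded this way: Python raises "... object cannot be interpreted as an integer" whenever an object without `__index__` is passed where a built-in expects an integer. A minimal sketch (the `LazyEvalWrapper` class below is a stand-in for clearml's lazy-evaluation proxy, not its actual implementation):

```python
class LazyEvalWrapper:
    """Stand-in for a lazy-evaluation proxy: it defers calling a
    function but does not implement __index__, so built-ins that
    need an integer reject it with the message seen in the log."""
    def __init__(self, fn):
        self._fn = fn

try:
    # Passing the wrapper where an int is required reproduces the message.
    range(LazyEvalWrapper(lambda: 3))
except TypeError as e:
    print(e)  # 'LazyEvalWrapper' object cannot be interpreted as an integer
```

This suggests a lazily evaluated value was handed to the upload path un-resolved, which matches the fix landing in the SDK rather than in the bucket configuration.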
To reproduce
from clearml import Task, Dataset

task = Task.init(
    project_name=project,
    task_name=name,
    reuse_last_task_id=False,
)
# to get it going on colab without clearml.conf
task.setup_aws_upload(
    bucket=s3bucket,
    host=s3endpoint,
    key=s3key,
    secret=s3secret,
    secure=s3secure,
)
...  # dataset creation elided in the original report
dataset.add_files('large.csv')
dataset.upload()
Expected behaviour
The file should get uploaded.
Environment
- app.clear.ml
- clearml 1.12.0
- Python 3.10.6
- OS: Linux (colab)
Thanks for reporting @nkgrush.
A fix should be coming out later this week.
@ainoam is this issue a regression? Is there an older version for self-hosted clearml that doesn't have this issue?
Thanks!
Erik
@ErikGartner clearml v1.10.0 has reportedly not shown this issue in a similar environment.
Hi @nkgrush, @ErikGartner, please try out ClearML SDK v1.12.1rc0 and let us know if it works for you.
Hi @jkhenning @ainoam, the patch seems to have resolved my issues.
Thanks!
Thanks! The commit resolved my issues.