Passing value_set through evaluation parameters crashes if the list size is more than 100
BilalGGS opened this issue
Dialect used: Snowflake
Expectation used: `expect_column_values_to_be_in_set`
Example of the expectation:

```json
"expectations": [
  {
    "expectation_type": "expect_column_values_to_be_in_set",
    "kwargs": {
      "column": "column_A",
      "value_set": {
        "$PARAMETER": "valid_ids"
      }
    },
    "meta": {
      "id": "test_unknown_ids"
    }
  }
],
```
Problem:
If the list passed to this expectation through evaluation parameters (loaded from a Python list, not from the database) has more than 100 elements, it breaks the underlying JSON.
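For context, the `$PARAMETER` substitution I am relying on can be sketched as below. This is a simplified, illustrative stand-in (the function name `resolve_parameters` and the recursive walk are my own, not Great Expectations' actual internals):

```python
import json

def resolve_parameters(kwargs, evaluation_parameters):
    """Recursively replace {"$PARAMETER": name} placeholders with the
    corresponding value from evaluation_parameters. Illustrative only."""
    if isinstance(kwargs, dict):
        # A dict whose sole key is "$PARAMETER" is a placeholder to substitute.
        if set(kwargs) == {"$PARAMETER"}:
            return evaluation_parameters[kwargs["$PARAMETER"]]
        return {k: resolve_parameters(v, evaluation_parameters) for k, v in kwargs.items()}
    if isinstance(kwargs, list):
        return [resolve_parameters(v, evaluation_parameters) for v in kwargs]
    return kwargs

kwargs = {"column": "column_A", "value_set": {"$PARAMETER": "valid_ids"}}
# A list of 100+ items like this is what triggers the crash for me.
params = {"valid_ids": [f"id_{i}" for i in range(150)]}
resolved = resolve_parameters(kwargs, params)
print(len(resolved["value_set"]))  # 150
```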
Error message:

```
sqlalchemy.exc.DataError: (psycopg2.errors.InvalidTextRepresentation) invalid input syntax for type json
LINE 1: ...rendered_task_instance_fields SET rendered_fields='{"run_nam...
                                                             ^
DETAIL:  Token "Infinity" is invalid.
CONTEXT:  JSON data, line 1: ...6e5", 150641421, "e6096a21", "e6093099", Infinity...
```
Question:
Is it a bug that the expectation crashes when the list size exceeds 100, or is this desired behaviour?
I have verified that it only crashes when I pass a value_set of 100+ items through evaluation parameters; the expectation works fine if I put the same 100+ item list directly into the expectation JSON. So I suspect it is related to how the operator handles big JSON objects.
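One plausible (unconfirmed) root cause, suggested by the error token: Python's `json.dumps` serializes `float("inf")` as the non-standard token `Infinity` by default (`allow_nan=True`), and strict JSON parsers such as PostgreSQL's `json` type reject it. The snippet below reproduces exactly that token; where an infinity value gets into the rendered fields for large lists is the open question:

```python
import json

# json.dumps emits the non-standard "Infinity" token for float("inf"),
# which strict JSON parsers (including PostgreSQL's json type) reject.
payload = ["e6096a21", "e6093099", float("inf")]
text = json.dumps(payload)
print(text)  # ["e6096a21", "e6093099", Infinity]

# Simulate a strict parser: parse_constant is invoked for Infinity/NaN tokens,
# so raising there mirrors Postgres's 'Token "Infinity" is invalid' error.
def reject(name):
    raise ValueError(f'Token "{name}" is invalid')

try:
    json.loads(text, parse_constant=reject)
except ValueError as exc:
    print("strict parser rejects:", exc)
```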