Exception when reading Decimal types written by connector
maver1ck opened this issue
Maciej Bryński commented
I'm using Kafka Connect HDFS Connector to write Avro topics with Decimal fields.
I'm using integration with Hive.
When reading the data I'm getting the following exception:
Error: java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: org.apache.hadoop.hive.serde2.io.HiveDecimalWritable cannot be cast to org.apache.hadoop.io.BytesWritable (state=,code=0)
I think the reason for the problem is that the connector is creating fields with a binary data_type in Hive (instead of decimal):
+---------------------------------+------------+----------+
| col_name                        | data_type  | comment  |
+---------------------------------+------------+----------+
| abc                             | binary     |          |
+---------------------------------+------------+----------+
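For context on why the mismatch arises: Avro's decimal logical type physically stores the unscaled integer as two's-complement big-endian bytes, with the precision/scale carried only in schema metadata. If the connector maps that field to Hive's `binary` type, Hive's serde hands the reader a `BytesWritable` where the decimal code path expects a `HiveDecimalWritable`, hence the `ClassCastException`. A minimal sketch of the Avro-style encoding (hypothetical helper names, not connector code):

```python
from decimal import Decimal

def decimal_to_avro_bytes(value: Decimal, scale: int) -> bytes:
    """Encode a decimal the way Avro's decimal logical type does:
    unscaled integer, two's-complement, big-endian."""
    unscaled = int(value.scaleb(scale))  # e.g. 12.34 with scale=2 -> 1234
    length = max(1, (unscaled.bit_length() + 8) // 8)  # leave room for the sign bit
    return unscaled.to_bytes(length, byteorder="big", signed=True)

def avro_bytes_to_decimal(data: bytes, scale: int) -> Decimal:
    """Reverse the encoding; the scale must come from the schema,
    it is not recoverable from the bytes alone."""
    unscaled = int.from_bytes(data, byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-scale)

raw = decimal_to_avro_bytes(Decimal("12.34"), scale=2)
print(raw)                               # b'\x04\xd2'
print(avro_bytes_to_decimal(raw, 2))     # 12.34
```

Because the bytes are meaningless without the schema's scale, a Hive column declared as plain `binary` loses the decimal semantics entirely; the fix would be for the connector to emit `decimal(p,s)` in the Hive DDL.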