[Bug] [Connector-V2] JDBC-SOURCE: SAP DBTech JDBC: Cannot convert Java type java.math.BigDecimal to SQL type DATE
dik111 opened this issue · comments
Search before asking
- I had searched in the issues and found no similar issues.
What happened
When I use SeaTunnel to synchronize SAP HANA data to Doris, an error is reported when partition_column is of DATE type.
I checked the code and found that the split value is converted into an epoch-millisecond timestamp; for example, 2024-02-06 07:12:00
is converted to 1707174720003
. The value then enters the (param instanceof BigDecimal) branch in the createNumberColumnSplitStatement
method. I think we should first determine whether the value is a timestamp; if it is, we should call statement.setTimestamp instead.
// here is the source code
else if (param instanceof BigDecimal) {
    statement.setBigDecimal(i + 1, (BigDecimal) param);
}
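A minimal sketch of the proposed fix, assuming the split boundary arrives as a BigDecimal holding epoch milliseconds (the class and helper name below are hypothetical illustrations, not the actual SeaTunnel code):

```java
import java.math.BigDecimal;
import java.sql.Timestamp;

public class SplitParamDemo {
    // Hypothetical helper: when the partition column's SQL type is DATE or
    // TIMESTAMP but the split boundary is a BigDecimal of epoch milliseconds,
    // convert it to java.sql.Timestamp so statement.setTimestamp can be used
    // instead of setBigDecimal (which the HANA driver rejects for DATE).
    static Timestamp toSqlTimestamp(BigDecimal epochMillis) {
        return new Timestamp(epochMillis.longValueExact());
    }

    public static void main(String[] args) {
        // Epoch-millis value taken from the report above
        BigDecimal boundary = new BigDecimal("1707174720003");
        Timestamp ts = toSqlTimestamp(boundary);
        System.out.println(ts.getTime()); // prints 1707174720003
    }
}
```

In createNumberColumnSplitStatement, the BigDecimal branch could then check the partition column's type first and call statement.setTimestamp(i + 1, toSqlTimestamp((BigDecimal) param)) for DATE/TIMESTAMP columns, falling back to setBigDecimal otherwise.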
SeaTunnel Version
2.3.5
SeaTunnel Config
env {
  parallelism = 8
  job.mode = "BATCH"
  flink.taskmanager.memory.process.size = "8G"
  flink.taskmanager.memory.framework.off-heap.size = "1G"
}

source {
  Jdbc {
    url = "jdbc:sap://xxx?reconnect=true"
    driver = "com.sap.db.jdbc.Driver"
    user = "xx"
    password = "xx"
    fetch_size = 10000
    query = "SELECT ACCOUNT_NUMBER,CREATION_DATE,ORG_ID FROM xxx WHERE CREATION_DATE between TO_DATE('2024-01') and TO_DATE('2024-05')"
    partition_column = "CREATION_DATE"
  }
}

sink {
  Doris {
    fenodes = "xx:8030"
    username = root
    password = "xx"
    database = "model"
    table = "xx"
    data_save_mode = "DROP_DATA"
    doris.config = {
      format = "json"
      read_json_by_line = "true"
      columns = "ACCOUNT_NUMBER,CREATION_DATE,ORG_ID"
    }
  }
}
Running Command
/data/software/seatunnel/seatunnel-2.3.5/bin/start-seatunnel-flink-15-connector-v2.sh \
--master yarn-per-job \
--config /data/software/seatunnel/seatunnel-2.3.5/config/hana-doris-flink.cnf
Error Exception
org.apache.seatunnel.connectors.seatunnel.jdbc.exception.JdbcConnectorException: ErrorCode:[JDBC-04], ErrorDescription:[Connector database failed] - open() failed.SAP DBTech JDBC: Cannot convert Java type java.math.BigDecimal to SQL type DATE.
at org.apache.seatunnel.connectors.seatunnel.jdbc.internal.JdbcInputFormat.open(JdbcInputFormat.java:104) ~[connector-jdbc-2.3.5.jar:2.3.5]
at org.apache.seatunnel.connectors.seatunnel.jdbc.source.JdbcSourceReader.pollNext(JdbcSourceReader.java:67) ~[connector-jdbc-2.3.5.jar:2.3.5]
at org.apache.seatunnel.translation.flink.source.FlinkSourceReader.pollNext(FlinkSourceReader.java:81) ~[seatunnel-flink-15-starter.jar:2.3.5]
at org.apache.flink.streaming.api.operators.SourceOperator.emitNext(SourceOperator.java:385) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.streaming.runtime.io.StreamTaskSourceInput.emitNext(StreamTaskSourceInput.java:68) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:542) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:231) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:831) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:780) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:935) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:914) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:728) ~[flink-dist-1.16.1.jar:1.16.1]
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:550) ~[flink-dist-1.16.1.jar:1.16.1]
at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_381]
Caused by: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC: Cannot convert Java type java.math.BigDecimal to SQL type DATE.
at com.sap.db.jdbc.exceptions.SQLExceptionSapDB._newInstance(SQLExceptionSapDB.java:215) ~[ngdbc-2.20.11.jar:2.20.11-354e45de0dcbabd02f58c506ecf1138161ee4b1e]
at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.newInstance(SQLExceptionSapDB.java:26) ~[ngdbc-2.20.11.jar:2.20.11-354e45de0dcbabd02f58c506ecf1138161ee4b1e]
at com.sap.db.jdbc.converters.AbstractConverter._newSetException(AbstractConverter.java:868) ~[ngdbc-2.20.11.jar:2.20.11-354e45de0dcbabd02f58c506ecf1138161ee4b1e]
at com.sap.db.jdbc.converters.AbstractConverter.setBigDecimal(AbstractConverter.java:644) ~[ngdbc-2.20.11.jar:2.20.11-354e45de0dcbabd02f58c506ecf1138161ee4b1e]
at com.sap.db.jdbc.PreparedStatementSapDB._setBigDecimal(PreparedStatementSapDB.java:2958) ~[ngdbc-2.20.11.jar:2.20.11-354e45de0dcbabd02f58c506ecf1138161ee4b1e]
at com.sap.db.jdbc.PreparedStatementSapDB.setBigDecimal(PreparedStatementSapDB.java:1118) ~[ngdbc-2.20.11.jar:2.20.11-354e45de0dcbabd02f58c506ecf1138161ee4b1e]
at org.apache.seatunnel.connectors.seatunnel.jdbc.source.FixedChunkSplitter.createNumberColumnSplitStatement(FixedChunkSplitter.java:207) ~[connector-jdbc-2.3.5.jar:2.3.5]
at org.apache.seatunnel.connectors.seatunnel.jdbc.source.FixedChunkSplitter.createSplitStatement(FixedChunkSplitter.java:100) ~[connector-jdbc-2.3.5.jar:2.3.5]
at org.apache.seatunnel.connectors.seatunnel.jdbc.source.ChunkSplitter.generateSplitStatement(ChunkSplitter.java:120) ~[connector-jdbc-2.3.5.jar:2.3.5]
at org.apache.seatunnel.connectors.seatunnel.jdbc.internal.JdbcInputFormat.open(JdbcInputFormat.java:98) ~[connector-jdbc-2.3.5.jar:2.3.5]
... 14 more
Zeta or Flink or Spark Version
FLINK 1.16.1
Java or Scala Version
Java 8
Scala 2.12
Screenshots
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct
This issue has been automatically marked as stale because it has not had recent activity for 30 days. It will be closed in the next 7 days if no further activity occurs.
This issue has been closed because it has not received a response for too long. You can reopen it if you encounter similar problems in the future.