wgzhao / Addax

Addax is a versatile open-source ETL tool that can seamlessly transfer data between various RDBMS and NoSQL databases, making it an ideal solution for data migration.

Home Page: https://wgzhao.github.io/Addax/

[Bug]: java.lang.NoClassDefFoundError when running the job script

suiquantong opened this issue · comments

What happened?

A bug happened!
CentOS 7.8
Addax 4.1.4

I built the package with IntelliJ IDEA on Windows, then copied it to the server and ran the job script there. The full console log is in "Relevant log output" below.

The error occurred, and my initial suspicion is a dependency conflict. I excluded every commons-logging dependency from addax-core to pin the version to 1.3.0, but after deploying to the server the same error still appears.

Version

4.1.3 (Default)

OS Type

Linux (Default)

Java JDK Version

Oracle JDK 1.8.0

Relevant log output

java.lang.NoClassDefFoundError: org/apache/logging/log4j/spi/LoggerAdapter
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.commons.logging.LogFactory.createFactory(LogFactory.java:419)
        at org.apache.commons.logging.LogFactory.lambda$newFactory$3(LogFactory.java:1432)
        at java.security.AccessController.doPrivileged(Native Method)
        at org.apache.commons.logging.LogFactory.newFactory(LogFactory.java:1431)
        at org.apache.commons.logging.LogFactory.getFactory(LogFactory.java:928)
        at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:987)
        at org.apache.hadoop.fs.FileSystem.<clinit>(FileSystem.java:137)
        at com.wgzhao.addax.plugin.writer.hdfswriter.HdfsHelper.getFileSystem(HdfsHelper.java:326)
        at com.wgzhao.addax.plugin.writer.hdfswriter.HdfsWriter$Job.init(HdfsWriter.java:80)
        at com.wgzhao.addax.core.job.JobContainer.initJobWriter(JobContainer.java:610)
        at com.wgzhao.addax.core.job.JobContainer.init(JobContainer.java:266)
        at com.wgzhao.addax.core.job.JobContainer.start(JobContainer.java:122)
        at com.wgzhao.addax.core.Engine.start(Engine.java:61)
        at com.wgzhao.addax.core.Engine.entry(Engine.java:112)
        at com.wgzhao.addax.core.Engine.main(Engine.java:139)
[root@master bin]# python /data/servers/addax/bin/addax.py /data/servers/addax/job/yawei_1f112832bbb739d30e9751e16c09a034.json
==================== DEPRECATED WARNING ========================
addax.py is deprecated, It's going to be removed in future release.
As a replacement, you can use addax.sh to run job
==================== DEPRECATED WARNING ========================


  ___      _     _
 / _ \    | |   | |
/ /_\ \ __| | __| | __ ___  __
|  _  |/ _` |/ _` |/ _` \ \/ /
| | | | (_| | (_| | (_| |>  <
\_| |_/\__,_|\__,_|\__,_/_/\_\

:: Addax version ::    (v4.1.4-SNAPSHOT)

2024-01-08 11:39:53.491 [        main] INFO  VMInfo               - VMInfo# operatingSystem class => sun.management.OperatingSystemImpl
2024-01-08 11:39:53.505 [        main] INFO  Engine               -
{
        "setting":{
                "speed":{
                        "channel":1
                },
                "errorLimit":{
                        "record":0,
                        "percentage":0.02
                }
        },
        "content":{
                "reader":{
                        "name":"mysqlreader",
                        "parameter":{
                                "username":"ruoyi_flowable",
                                "password":"*****",
                                "column":[
                                        "`user_id`",
                                        "`dept_id`",
                                        "`user_name`",
                                        "`nick_name`",
                                        "`user_type`",
                                        "`email`",
                                        "`phonenumber`",
                                        "`sex`",
                                        "`avatar`",
                                        "`password`",
                                        "`status`",
                                        "`del_flag`",
                                        "`login_ip`",
                                        "`login_date`",
                                        "`create_by`",
                                        "`create_time`",
                                        "`update_by`",
                                        "`update_time`",
                                        "`remark`"
                                ],
                                "splitPk":"",
                                "connection":[
                                        {
                                                "table":[
                                                        "sys_user"
                                                ],
                                                "jdbcUrl":[
                                                        "jdbc:mysql://172.18.1.126:3306/ruoyi_flowable?useUnicode=true&characterEncoding=utf-8&useSSL=false&rewriteBatchedStatements=true"
                                                ]
                                        }
                                ]
                        }
                },
                "writer":{
                        "name":"hdfswriter",
                        "parameter":{
                                "defaultFS":"hdfs://172.18.1.82:8020",
                                "fileType":"text",
                                "path":"/data/servers/hive/warehouse/test.db/sys_user_7lnlbq",
                                "fileName":"sys_user_7lnlbq",
                                "writeMode":"append",
                                "fieldDelimiter":",",
                                "column":[
                                        {
                                                "name":"user_id",
                                                "type":"bigint"
                                        },
                                        {
                                                "name":"dept_id",
                                                "type":"bigint"
                                        },
                                        {
                                                "name":"user_name",
                                                "type":"string"
                                        },
                                        {
                                                "name":"nick_name",
                                                "type":"string"
                                        },
                                        {
                                                "name":"user_type",
                                                "type":"string"
                                        },
                                        {
                                                "name":"email",
                                                "type":"string"
                                        },
                                        {
                                                "name":"phonenumber",
                                                "type":"string"
                                        },
                                        {
                                                "name":"sex",
                                                "type":"string"
                                        },
                                        {
                                                "name":"avatar",
                                                "type":"string"
                                        },
                                        {
                                                "name":"password",
                                                "type":"string"
                                        },
                                        {
                                                "name":"status",
                                                "type":"string"
                                        },
                                        {
                                                "name":"del_flag",
                                                "type":"string"
                                        },
                                        {
                                                "name":"login_ip",
                                                "type":"string"
                                        },
                                        {
                                                "name":"login_date",
                                                "type":"timestamp"
                                        },
                                        {
                                                "name":"create_by",
                                                "type":"string"
                                        },
                                        {
                                                "name":"create_time",
                                                "type":"timestamp"
                                        },
                                        {
                                                "name":"update_by",
                                                "type":"string"
                                        },
                                        {
                                                "name":"update_time",
                                                "type":"timestamp"
                                        },
                                        {
                                                "name":"remark",
                                                "type":"string"
                                        }
                                ]
                        }
                }
        }
}

2024-01-08 11:39:53.522 [        main] INFO  JobContainer         - The jobContainer begins to process the job.
2024-01-08 11:39:53.764 [       job-0] INFO  OriginalConfPretreatmentUtil - Available jdbcUrl [jdbc:mysql://172.18.1.126:3306/ruoyi_flowable?useUnicode=true&characterEncoding=utf-8&useSSL=false&rewriteBatchedStatements=true&yearIsDateType=false&zeroDateTimeBehavior=convertToNull&tinyInt1isBit=false&rewriteBatchedStatements=true&serverTimezone=GMT%2B8&useSSL=false].
2024-01-08 11:39:53.794 [       job-0] INFO  OriginalConfPretreatmentUtil - The table [sys_user] has columns [user_id,dept_id,user_name,nick_name,user_type,email,phonenumber,sex,avatar,password,status,del_flag,login_ip,login_date,create_by,create_time,update_by,update_time,remark].
java.lang.NoClassDefFoundError: org/apache/logging/log4j/spi/LoggerAdapter
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.commons.logging.LogFactory.createFactory(LogFactory.java:419)
        at org.apache.commons.logging.LogFactory.lambda$newFactory$3(LogFactory.java:1432)
        at java.security.AccessController.doPrivileged(Native Method)
        at org.apache.commons.logging.LogFactory.newFactory(LogFactory.java:1431)
        at org.apache.commons.logging.LogFactory.getFactory(LogFactory.java:928)
        at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:987)
        at org.apache.hadoop.fs.FileSystem.<clinit>(FileSystem.java:137)
        at com.wgzhao.addax.plugin.writer.hdfswriter.HdfsHelper.getFileSystem(HdfsHelper.java:326)
        at com.wgzhao.addax.plugin.writer.hdfswriter.HdfsWriter$Job.init(HdfsWriter.java:80)
        at com.wgzhao.addax.core.job.JobContainer.initJobWriter(JobContainer.java:610)
        at com.wgzhao.addax.core.job.JobContainer.init(JobContainer.java:266)
        at com.wgzhao.addax.core.job.JobContainer.start(JobContainer.java:122)
        at com.wgzhao.addax.core.Engine.start(Engine.java:61)
        at com.wgzhao.addax.core.Engine.entry(Engine.java:112)
        at com.wgzhao.addax.core.Engine.main(Engine.java:139)
Caused by: java.lang.ClassNotFoundException: org.apache.logging.log4j.spi.LoggerAdapter
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 17 more
2024-01-08 11:39:53.987 [       job-0] ERROR Engine               - java.lang.NoClassDefFoundError: org/apache/logging/log4j/spi/LoggerAdapter

Check whether the plugin/writer/hdfswriter/libs/ directory contains a symlink to log4j-api-&lt;version&gt;.jar. If it does not, try copying one, e.g. log4j-api-2.17.1.jar, into that directory and see if that helps.

When the hdfswriter plugin is packaged, the log4j-api-&lt;version&gt;.jar file is bundled into it.
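The suggested check can be scripted roughly as follows. This is a sketch, not part of Addax: the helper name is invented, and the directory layout and source jar location are assumptions based on a default install.

```shell
# ensure_log4j_api LIBS_DIR SOURCE_JAR
# If LIBS_DIR contains no log4j-api-*.jar (file or symlink), copy SOURCE_JAR in.
# Illustrative helper; paths follow a typical Addax layout.
ensure_log4j_api() {
    libs_dir="$1"    # e.g. /data/servers/addax/plugin/writer/hdfswriter/libs
    source_jar="$2"  # e.g. a log4j-api-2.17.1.jar taken from another install
    # `ls` exits non-zero when the glob matches nothing
    if ls "$libs_dir"/log4j-api-*.jar >/dev/null 2>&1; then
        echo "log4j-api already present in $libs_dir"
    else
        cp "$source_jar" "$libs_dir"/ || return 1
        echo "copied $(basename "$source_jar") into $libs_dir"
    fi
}

# Example usage:
# ensure_log4j_api /data/servers/addax/plugin/writer/hdfswriter/libs ~/log4j-api-2.17.1.jar
```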