Project build fails on Windows
miroslavpojer opened this issue · comments
miroslavpojer commented
Describe the bug
Project builds fail on a Windows machine.
Code snippet that caused the issue
sbt ++2.13.9 assembly -DSPARK_VERSION=3.2.2
sbt ++2.13.9 test -DSPARK_VERSION=3.2.2
Expected behavior
Successful build.
Context
- Cobrix version: master
- Spark version: 3.2.2
- Scala version: 2.13.9
- Operating system: Windows 10
Output from assembly variation
C:\Users\ab024LL\absa\git\cobrix>sbt ++2.13.9 assembly -DSPARK_VERSION=3.2.2
[info] welcome to sbt 1.7.1 (Temurin Java 1.8.0_345)
[info] loading global plugins from C:\Users\ab024LL\.sbt\1.0\plugins
[info] loading settings for project cobrix-build from plugins.sbt ...
[info] loading project definition from C:\Users\ab024LL\absa\git\cobrix\project
[info] loading settings for project cobrix from build.sbt,publish.sbt,version.sbt ...
[info] set current project to cobrix (in build file:/C:/Users/ab024LL/absa/git/cobrix/)
[info] Setting Scala version to 2.13.9 on 4 projects.
[info] Reapplying settings...
[info] set current project to cobrix (in build file:/C:/Users/ab024LL/absa/git/cobrix/)
[info] Building with Spark 3.3.0, Scala 2.13.9
Scala 2.13.9 compiler options: -encoding UTF-8 -deprecation -unchecked -feature -explaintypes -opt:l:inline -opt-inline-from:<source> -opt-warnings -Ywarn-extra-implicit -Ywarn-numeric-widen -Ywarn-unused:implicits -Ywarn-unused:locals -Ywarn-unused:params -Ywarn-unused:patvars -Ywarn-unused:privates -Ywarn-value-discard -Xsource:2.13 -target:jvm-1.8
[info] Strategy 'discard' was applied to 4 files (Run the task at debug level to see details)
[info] Assembly up to date: C:\Users\ab024LL\absa\git\cobrix\cobol-parser\target\scala-2.13\cobol-parser-assembly-2.6.1-SNAPSHOT.jar
[info] Strategy 'discard' was applied to 4 files (Run the task at debug level to see details)
[warn] Ignored unknown package option FixedTimestamp(Some(1262304000000))
[success] Total time: 16 s, completed Nov 16, 2022 3:08:14 PM
[error] Expected symbol
[error] Not a valid command: -
[error] Expected end of input.
[error] Expected '--'
[error] Expected 'debug'
[error] Expected 'info'
[error] Expected 'warn'
[error] Expected 'error'
[error] Expected 'addPluginSbtFile'
[error] -DSPARK_VERSION=3.2.2
[error] ^
Output from test variation
[info] *** 13 SUITES ABORTED ***
[error] Error during tests:
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test2RecordOffsetsSpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test10NonTerminalsSpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test7FillersSpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test15PathWithAsteriskSpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test20InputFileNameSpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test6TypeVarietySpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test1FixedLengthRecordsSpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test24DebugModeSpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test12MergeCopybooksSpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test9CodePages
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test3SegmentFieldSpec
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test8NonPrintables
[error] za.co.absa.cobrix.spark.cobol.source.integration.Test16FixedLenSegmentRedefinesSpec
[error] (sparkCobol / Test / test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 104 s (01:44), completed Nov 16, 2022 3:12:25 PM
Ruslan Yushchenko commented
- EOL test issues are fixed by the PR: #530
- java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$POSIX.stat
  is a Hadoop-on-Windows issue. You need to set up HADOOP_HOME and a couple of utilities:
  https://stackoverflow.com/questions/41851066/exception-in-thread-main-java-lang-unsatisfiedlinkerror-org-apache-hadoop-io-
- For the -DSPARK_VERSION=... issue, it seems to be an sbt issue on Windows:
  https://stackoverflow.com/questions/59144913/run-sbt-1-2-8-project-with-java-d-options-on-windows
  You can work around it by using the default Spark version for a given Scala version:
  sbt ++2.11.12 assembly
  sbt ++2.12.17 assembly
  sbt ++2.13.9 assembly
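Another common way around the -D parsing problem is to pass the system property through the SBT_OPTS environment variable, which the sbt launcher reads, instead of putting -D on the command line. A Windows cmd sketch, untested on this setup (the Spark version value is the one from this issue):

```shell
:: Windows cmd sketch (untested here): pass -D options via SBT_OPTS
:: so the sbt launcher does not try to parse them as sbt commands.
set SBT_OPTS=-DSPARK_VERSION=3.2.2
sbt ++2.13.9 assembly
```

This keeps the requested Spark version while avoiding the "Not a valid command: -" error shown above.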
Ruslan Yushchenko commented
Due to the Hadoop and sbt issues, the only combination of Scala and Spark that worked for running tests on Windows without tricky workarounds was:
sbt ++2.11.12 test
miroslavpojer commented
I can confirm that sbt ++2.11.12 test
runs without changes.
To be able to run sbt ++2.12.17 test
and sbt ++2.13.9 test,
I used Hadoop v3.2.4, and no more tests were aborted.
When I used Hadoop 2.10.2, tests were still aborting. (Noted in the project README to use 3.2.2+.)
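For reference, the Hadoop-on-Windows setup behind this usually amounts to pointing HADOOP_HOME at a directory whose bin folder contains winutils.exe (and hadoop.dll) built for the matching Hadoop version. A Windows cmd sketch; the C:\hadoop path is an assumption for illustration, not part of this thread:

```shell
:: Windows cmd sketch (assumed layout): HADOOP_HOME must contain
:: bin\winutils.exe and bin\hadoop.dll built for the Hadoop version
:: in use (per the comment above, 3.2.4 worked; 2.10.2 did not).
set HADOOP_HOME=C:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin
sbt ++2.13.9 test
```

Without this, Hadoop's native I/O calls fail with the UnsatisfiedLinkError on NativeIO$POSIX.stat mentioned earlier.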
Ruslan Yushchenko commented
So it seems working? Can I close the issue?
miroslavpojer commented
> So it seems working? Can I close the issue?
Yes