ScalaTest Maven Plugin

"wildcardSuites" can't find tests, "suites" can

Azuaron opened this issue

For full details, see this StackOverflow question.

In short, discovery fails to find my tests when using the wildcardSuites option:

<wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>

But it succeeds when using just the suites option:

<suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest</suites>
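
For reference, the plugin also exposes a membersOnlySuites option, which (if I'm reading the ScalaTest Runner docs correctly) maps to the -m selector and matches only suites that are direct members of the given package, rather than the whole subtree that wildcardSuites covers. I haven't verified whether it hits the same problem; a sketch:

<membersOnlySuites>com.cainc.data.etl.schema.proficiency</membersOnlySuites>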

Versions:
scala.version=2.10.5
scalactic_2.10=2.2.6
scalatest_2.10=2.2.6
spark-testing-base_2.10=1.6.1_0.3.3
scala-maven-plugin=3.2.1
scalatest-maven-plugin=1.0

Relevant pom plugins:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <configuration>
                <skipTests>true</skipTests>
            </configuration>
        </plugin>
        <!-- enable scalatest -->
        <plugin>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest-maven-plugin</artifactId>
            <version>1.0</version>
            <configuration>
                <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
                <wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>
            </configuration>
            <executions>
                <execution>
                    <id>test</id>
                    <goals>
                        <goal>test</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>

And here's the test class:

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.sql.{DataFrame, DataFrameReader, SQLContext}
import org.scalatest.{BeforeAndAfter, FlatSpec, Matchers}

import scala.util.{Failure, Success, Try}

class ProficiencySchemaITest extends FlatSpec with Matchers with SharedSparkContext with BeforeAndAfter {
    private var schemaStrategy: SchemaStrategy = _
    private var dataReader: DataFrameReader = _

    before {
        val sqlContext = new SQLContext(sc)
        import sqlContext._
        import sqlContext.implicits._

        val dataInReader = sqlContext.read.format("com.databricks.spark.csv")
                                            .option("header", "true")
                                            .option("nullValue", "")
        schemaStrategy = SchemaStrategyChooser("dim_state_test_proficiency")
        dataReader = schemaStrategy.applySchema(dataInReader)
    }

    "Proficiency Validation" should "pass with the CSV file proficiency-valid.csv" in {
        val dataIn = dataReader.load("src/test/resources/proficiency-valid.csv")

        val valid: Try[DataFrame] = Try(schemaStrategy.validateCsv(dataIn))
        valid match {
            case Success(v) => ()
            case Failure(e) => fail("Validation failed on what should have been a clean file", e)
        }
    }
}
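
As a data point, discovery can be exercised outside Maven by invoking ScalaTest's Runner with the same selectors the plugin passes through (-w for wildcardSuites, -s for suites). A minimal sketch, assuming the compiled test classes land in target/test-classes and that Runner.run is available as in ScalaTest 2.x; DiscoveryCheck is just an illustrative name:

import org.scalatest.tools.Runner

object DiscoveryCheck {
  def main(args: Array[String]): Unit = {
    // Same selection the plugin should be making: -R sets the runpath to the
    // compiled test classes, -w requests wildcard (package-prefix) discovery,
    // and -o reports to standard out.
    val allPassed = Runner.run(Array(
      "-R", "target/test-classes",
      "-w", "com.cainc.data.etl.schema.proficiency",
      "-o"))
    println(s"all discovered tests passed: $allPassed")
  }
}

If this finds the suite while the plugin does not, the problem would be in how the plugin builds its arguments rather than in ScalaTest discovery itself.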

When I run mvn test, it can't find any tests and outputs this message:

[INFO] --- scalatest-maven-plugin:1.0:test (test) @ load-csv-into-db ---
Discovery starting.
Discovery completed in 54 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 133 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.

Is there any fix for this?