AbsaOSS / cobrix

A COBOL parser and Mainframe/EBCDIC data source for Apache Spark



Allow specifying expressions for record size

yruslan opened this issue

Background

The idea is that, instead of naming a single record length column, it could sometimes be useful to define a record length expression based on one or more columns.

Feature

Support a record length expression, evaluated per record against one or more copybook fields, as an alternative to pointing at a single record length column.

Proposed Solution

      01  R.
          03 RECORD-LEN    PIC 9(3).
          03 FIELDS        PIC X(100).

      val df = spark.read
        .format("cobol")
        .option("copybook_contents", copyBook)
        .option("record_format", "F")
        .option("record_length_expression", "500 + @RECORD_LEN * 10")
        .load(filePath)
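One way such an expression could be evaluated is to substitute each @FIELD reference with the value already decoded from the record header, then reduce the arithmetic. A minimal sketch, assuming a hypothetical `RecordLengthExpression.evaluate` helper (this is an illustration, not Cobrix's API) that handles only integer literals, field references, `+`, and `*`:

```scala
// Hypothetical evaluator for record-length expressions such as
// "500 + @RECORD_LEN * 10". A sketch, not Cobrix's implementation:
// it supports only integer literals, @FIELD references, '+' and '*'.
object RecordLengthExpression {
  /** Evaluates an expression against field values decoded from the record. */
  def evaluate(expression: String, fields: Map[String, Int]): Int = {
    // Replace each @FIELD reference with its decoded numeric value.
    val substituted = "@([A-Za-z0-9_]+)".r
      .replaceAllIn(expression, m => fields(m.group(1)).toString)
    // Usual precedence for this subset: evaluate products, then sum them.
    substituted.split('+')
      .map(term => term.split('*').map(_.trim.toInt).product)
      .sum
  }
}
```

With RECORD-LEN decoded as 25, the example expression would yield 500 + 25 * 10 = 750 bytes for that record.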