OPM / opm-common

Common components for OPM, in particular build system (cmake).

Home Page: http://www.opm-project.org

Why AQUFLUX triggers an extra `/` parsing error while AQUANCON does not

GitPaean opened this issue

The following keywords are from the DATA file AQUFLUX-01.DATA in opm-tests.

AQUFLUX
 1 1.0 /
 2 1.0 /
/

AQUANCON
-- Aq#     I1 I2  J1   J2  K1 K2 FACE
  1     1   2    1   1   1   5  I-    0.8         1*         NO/
  2     19  20   1   1   1   5  I+    1.         1*         NO/
/

When using opmi to parse this file, the following error was reported for the keyword AQUFLUX:

In AQUFLUX-01_ECL.DATA line 189
Extra '/' detected in AQUFLUX-01_ECL.DATA line 189

After removing the extra /, parsing continues until the SUMMARY section.
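
In other words, with the current keyword definition the AQUFLUX block only parses when written without its closing slash:

AQUFLUX
 1 1.0 /
 2 1.0 /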

My question is why AQUFLUX cannot have the final extra / while AQUANCON can. What is the difference between the two keywords in terms of setup? Maybe AQUFLUX is a SCHEDULE keyword while AQUANCON is not?

Thanks.

My question is why AQUFLUX cannot have the final extra / while AQUANCON can.

It's because AQUFLUX is defined/expected to have exactly NANAQU (i.e., AQUDIMS(5)) records. That's probably wrong; it should be at most NANAQU records instead. Whenever we have a keyword with a fixed number of records, e.g., PVTW, our parser flags a terminating slash as an error, because no further input is expected after the last record.

This is effectively yet another instance of failing to properly capture the structure of aquifer-related keywords. Another case is the old issue #1130.
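
For comparison, here is a minimal sketch of a fixed-record keyword such as the PVTW mentioned above (assuming NTPVT = 1 in TABDIMS; the values are purely illustrative). The keyword occupies exactly one record and takes no additional closing slash, so a trailing lone / would be flagged here just as it is for AQUFLUX:

PVTW
-- illustrative values only; NTPVT = 1 assumed in TABDIMS
-- Pref     Bw        Cw        MUw     Cvw
   3600.0   1.00341   3.0E-06   0.96    0.0 /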

Thanks for the response. I think I should probably remove the size definition from the AQUFLUX keyword definition, i.e., this block:

  "size": {
    "keyword": "AQUDIMS",
    "item": "NANAQU"
  },

I think I should probably remove the size definition from the AQUFLUX keyword definition?

Yes, I think that'd be the correct course of action here. Removing the size entry would switch AQUFLUX to a "variable number of records" keyword, which in turn means the terminating / character becomes required.
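
As a rough sketch, assuming the JSON keyword format quoted above, the trimmed-down definition might then look roughly like the following; the item names and sections are placeholders rather than the actual contents of the AQUFLUX keyword file, the only point being that the "size" block is gone:

  {
    "name": "AQUFLUX",
    "sections": ["SOLUTION", "SCHEDULE"],
    "items": [
      { "name": "AQUIFER_ID", "value_type": "INT" },
      { "name": "FLUX", "value_type": "DOUBLE" }
    ]
  }

With no size entry, the parser reads records until it reaches the lone terminating /, so the AQUFLUX input at the top of this issue, including its final /, would then parse as intended.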

Thanks @bska for the help. The issue has been addressed. Closing it now.