OutOfMemoryError
hennihaus opened this issue · comments
Hello everyone,
Since the related issue #76 has not been updated in two years, I am opening a new one.
When I try to read a 30 MB GTFS zip file, I get this error:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.base/java.util.regex.Matcher.<init>(Matcher.java:249)
at java.base/java.util.regex.Pattern.matcher(Pattern.java:1133)
at org.onebusaway.gtfs.serialization.mappings.StopTimeFieldMappingFactory.getStringAsSeconds(StopTimeFieldMappingFactory.java:70)
at org.onebusaway.gtfs.serialization.mappings.StopTimeFieldMappingFactory$StopTimeFieldMapping.convert(StopTimeFieldMappingFactory.java:123)
at org.onebusaway.gtfs.serialization.mappings.StopTimeFieldMappingFactory$StopTimeFieldMapping.translateFromCSVToObject(StopTimeFieldMappingFactory.java:100)
at org.onebusaway.csv_entities.IndividualCsvEntityReader.readEntity(IndividualCsvEntityReader.java:131)
at org.onebusaway.csv_entities.IndividualCsvEntityReader.handleLine(IndividualCsvEntityReader.java:98)
at org.onebusaway.csv_entities.CsvEntityReader.readEntities(CsvEntityReader.java:157)
at org.onebusaway.csv_entities.CsvEntityReader.readEntities(CsvEntityReader.java:120)
at org.onebusaway.csv_entities.CsvEntityReader.readEntities(CsvEntityReader.java:115)
at org.onebusaway.gtfs.serialization.GtfsReader.run(GtfsReader.java:171)
at org.onebusaway.gtfs.serialization.GtfsReader.run(GtfsReader.java:159)
at de.blackforestsolutions.dravelopsstationimporter.service.communicationservice.GtfsApiServiceImpl.extractStopsFrom(GtfsApiServiceImpl.java:65)
at de.blackforestsolutions.dravelopsstationimporter.service.communicationservice.GtfsApiServiceImpl.executeApiCall(GtfsApiServiceImpl.java:49)
at de.blackforestsolutions.dravelopsstationimporter.service.communicationservice.GtfsApiServiceImpl$$Lambda$351/0x0000000100410040.apply(Unknown Source)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.onNext(FluxFlatMap.java:378)
at reactor.core.publisher.FluxIterable$IterableSubscription.slowPath(FluxIterable.java:267)
at reactor.core.publisher.FluxIterable$IterableSubscription.request(FluxIterable.java:225)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.onSubscribe(FluxFlatMap.java:363)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:161)
at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:86)
at reactor.core.publisher.Flux.subscribe(Flux.java:8325)
at reactor.test.DefaultStepVerifierBuilder$DefaultStepVerifier.toVerifierAndSubscribe(DefaultStepVerifierBuilder.java:868)
at reactor.test.DefaultStepVerifierBuilder$DefaultStepVerifier.verify(DefaultStepVerifierBuilder.java:824)
at reactor.test.DefaultStepVerifierBuilder$DefaultStepVerifier.verify(DefaultStepVerifierBuilder.java:816)
at reactor.test.DefaultStepVerifierBuilder.verifyComplete(DefaultStepVerifierBuilder.java:683)
at de.blackforestsolutions.dravelopsstationimporter.service.communicationservice.GtfsApiServiceTest.test_(GtfsApiServiceTest.java:65)
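For context, the top frames show the OOM is thrown while converting stop_times fields: every arrival/departure time string is parsed with a regex, and all parsed entities are retained in the in-memory store, so the heap fills up as the (typically very large) stop_times.txt is read. A simplified sketch of that parsing step, not the library's exact implementation:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class StopTimeParse {
    // GTFS times may exceed 24:00:00 for trips running past midnight
    private static final Pattern TIME = Pattern.compile("^(\\d{1,2}):(\\d{2}):(\\d{2})$");

    static int getStringAsSeconds(String value) {
        Matcher m = TIME.matcher(value); // a new Matcher is allocated per call
        if (!m.matches()) {
            throw new IllegalArgumentException("invalid time: " + value);
        }
        return Integer.parseInt(m.group(1)) * 3600
                + Integer.parseInt(m.group(2)) * 60
                + Integer.parseInt(m.group(3));
    }

    public static void main(String[] args) {
        System.out.println(getStringAsSeconds("25:30:00")); // prints 91800
    }
}
```

The Matcher allocation in the trace is just where the final allocation happens to fail; the real memory consumer is the accumulated entity store.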
This is the code where the error happens:
private Collection<Stop> extractStopsFrom(File gtfsZip) throws IOException {
    GtfsReader gtfsReader = new GtfsReader();
    // set the input file
    gtfsReader.setInputLocation(gtfsZip);
    // read the entire feed into an in-memory store
    GtfsDaoImpl store = new GtfsDaoImpl();
    gtfsReader.setEntityStore(store);
    gtfsReader.run();
    return store.getAllStops();
}
I use a Maven project with Java 11.
How much memory do I need to avoid this error? Or am I doing something wrong?
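As a rough back-of-the-envelope estimate (the numbers below are assumptions, not measurements of this feed): a 30 MB GTFS zip can expand to millions of stop_times rows, and each row becomes a Java object with per-object overhead, so the feed can need an order of magnitude more heap than the zip size suggests:

```java
public class HeapEstimate {
    public static void main(String[] args) {
        long rows = 2_000_000L;  // assumed stop_times row count for a mid-size feed
        long bytesPerRow = 150L; // assumed per-StopTime object footprint, fields + headers
        long mb = rows * bytesPerRow / (1024 * 1024);
        System.out.println("~" + mb + " MB of heap for stop_times alone");
    }
}
```

Under these assumptions the stop_times entities alone would want a few hundred MB, which is why a default or small heap fails on a feed this size.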
@hennihaus Try increasing your Java heap size with the -Xmx command line parameter, e.g. -Xmx256m.
See https://stackoverflow.com/questions/6452765/how-to-increase-heap-size-of-jvm
I fixed this error by raising the -Xmx command line parameter in the pom.xml:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>${maven-surefire-plugin.version}</version>
    <configuration>
        <junitArtifactName>junit:junit</junitArtifactName>
        <encoding>${project.build.sourceEncoding}</encoding>
        <argLine>-Xms256m -Xmx1G -XX:MaxPermSize=512m -ea
            -Dfile.encoding=UTF-8
        </argLine>
    </configuration>
</plugin>
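One way to verify that the argLine actually reached the forked test JVM (Surefire forks a separate process, so IDE settings don't apply) is to print the maximum heap from inside a test. With -Xmx1G it should report roughly 1024 MB:

```java
public class MaxHeapCheck {
    public static void main(String[] args) {
        // maxMemory() returns the -Xmx limit the running JVM was started with
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("JVM max heap: " + maxMb + " MB");
    }
}
```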
Thanks for the quick help, especially since this turned out not to be a bug in your repo.