epacke / logstash-pipeline-tester

Tool for testing logstash pipelines

Home Page: https://loadbalancing.se/2020/03/11/logstash-pipeline-tester/

backend is never active

anubisg1 opened this issue

As per the title, I built the container and started it. The docker logs are as follows.
What I noticed is that the logstash container logs a lot of data (and works), but the config_tester container (where the backend should reside) has very few logs for some reason.

admin@localhost:~/logstash-pipeline-tester$ sudo docker-compose up
Creating network "logstash-pipeline-tester_default" with the default driver
Creating config_tester ... done
Creating logstash      ... done
Attaching to logstash, config_tester
logstash       | 2022-10-05 10:49:33,022 CRIT Supervisor is running as root.  Privileges were not dropped because no user is specified in the config file.  If you intend to run as root, you can set user=root in the config file to avoid this message.
logstash       | 2022-10-05 10:49:33,023 INFO supervisord started with pid 7
logstash       | 2022-10-05 10:49:34,026 INFO spawned: 'logstash' with pid 9
logstash       | Using bundled JDK: /usr/share/logstash/jdk
logstash       | OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
config_tester  | Logstash config tester running on port 8080!
logstash       | 2022-10-05 10:49:35,183 INFO success: logstash entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
logstash       | Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
logstash       | [2022-10-05T10:49:48,318][INFO ][logstash.runner          ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
logstash       | [2022-10-05T10:49:48,328][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.1.1", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.14.1+1 on 11.0.14.1+1 +indy +jit [linux-x86_64]"}
logstash       | [2022-10-05T10:49:48,329][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
logstash       | [2022-10-05T10:49:48,363][INFO ][logstash.settings        ] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
logstash       | [2022-10-05T10:49:48,372][INFO ][logstash.settings        ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
logstash       | [2022-10-05T10:49:48,618][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"deb0ad16-29dd-4b2c-bfb7-ed1bf0b1ebeb", :path=>"/usr/share/logstash/data/uuid"}
logstash       | [2022-10-05T10:49:49,744][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
logstash       | [2022-10-05T10:49:50,702][INFO ][org.reflections.Reflections] Reflections took 125 ms to scan 1 urls, producing 120 keys and 417 values 
logstash       | [2022-10-05T10:49:51,112][INFO ][logstash.javapipeline    ] Pipeline `generic-json` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
logstash       | [2022-10-05T10:49:51,262][INFO ][logstash.filters.json    ][generic-json] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
logstash       | [2022-10-05T10:49:51,387][INFO ][logstash.javapipeline    ][generic-json] Starting pipeline {:pipeline_id=>"generic-json", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/share/logstash/pipeline/generic-json/generic-json.conf"], :thread=>"#<Thread:0x691f28e1@/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52 run>"}
logstash       | [2022-10-05T10:49:52,428][WARN ][logstash.filters.translate] You are using a deprecated config setting "destination" set in translate. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Use `target` option instead. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"destination", :plugin=><LogStash::Filters::Translate dictionary_path=>"/usr/share/logstash/dictionaries/f5-syslogpriorities.yml", destination=>"syslog_severity", id=>"6859d983e0536f7c2f340f9fbfc7353f35cf8b081084aa213d8e5260bceac4b5", field=>"syslog_pri", enable_metric=>true, periodic_flush=>false, refresh_interval=>300, exact=>true, regex=>false, refresh_behaviour=>"merge">}
logstash       | [2022-10-05T10:49:52,435][WARN ][logstash.filters.translate] You are using a deprecated config setting "field" set in translate. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Use `source` option instead. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"field", :plugin=><LogStash::Filters::Translate dictionary_path=>"/usr/share/logstash/dictionaries/f5-syslogpriorities.yml", destination=>"syslog_severity", id=>"6859d983e0536f7c2f340f9fbfc7353f35cf8b081084aa213d8e5260bceac4b5", field=>"syslog_pri", enable_metric=>true, periodic_flush=>false, refresh_interval=>300, exact=>true, regex=>false, refresh_behaviour=>"merge">}
logstash       | [2022-10-05T10:49:52,554][INFO ][logstash.javapipeline    ] Pipeline `f5-syslog` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
logstash       | [2022-10-05T10:49:52,581][WARN ][logstash.filters.grok    ][f5-syslog] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
logstash       | [2022-10-05T10:49:52,684][INFO ][logstash.javapipeline    ][generic-json] Pipeline Java execution initialization time {"seconds"=>1.29}
logstash       | [2022-10-05T10:49:52,821][WARN ][logstash.filters.grok    ][f5-syslog] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
logstash       | [2022-10-05T10:49:52,856][INFO ][logstash.javapipeline    ][generic-json] Pipeline started {"pipeline.id"=>"generic-json"}
logstash       | [2022-10-05T10:49:52,886][INFO ][logstash.inputs.tcp      ][generic-json][278db2c56d400fd170f30212a79c4fb3b065952853456a2b545800c10c8fb383] Starting tcp input listener {:address=>"0.0.0.0:5060", :ssl_enable=>false}
logstash       | [2022-10-05T10:49:52,961][WARN ][logstash.filters.grok    ][f5-syslog] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
logstash       | [2022-10-05T10:49:52,998][WARN ][logstash.filters.grok    ][f5-syslog] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
logstash       | [2022-10-05T10:49:53,052][WARN ][logstash.filters.grok    ][f5-syslog] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
logstash       | [2022-10-05T10:49:53,133][WARN ][logstash.filters.grok    ][f5-syslog] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
logstash       | [2022-10-05T10:49:53,151][WARN ][logstash.filters.grok    ][f5-syslog] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
logstash       | [2022-10-05T10:49:53,174][WARN ][logstash.filters.grok    ][f5-syslog] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
logstash       | [2022-10-05T10:49:53,345][WARN ][deprecation.logstash.filters.translate][f5-syslog] `field` option is deprecated; use `source` instead.
logstash       | [2022-10-05T10:49:53,346][WARN ][deprecation.logstash.filters.translate][f5-syslog] `destination` option is deprecated; use `target` instead.
logstash       | [2022-10-05T10:49:53,352][WARN ][logstash.filters.grok    ][f5-syslog] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
logstash       | [2022-10-05T10:49:53,394][INFO ][logstash.javapipeline    ][f5-syslog] Starting pipeline {:pipeline_id=>"f5-syslog", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/share/logstash/pipeline/f5-syslog/f5-syslog.conf"], :thread=>"#<Thread:0x5615cd96 run>"}
logstash       | [2022-10-05T10:49:53,744][INFO ][logstash.javapipeline    ][f5-syslog] Pipeline Java execution initialization time {"seconds"=>0.35}
logstash       | [2022-10-05T10:49:53,756][INFO ][logstash.javapipeline    ][f5-syslog] Pipeline started {"pipeline.id"=>"f5-syslog"}
logstash       | [2022-10-05T10:49:53,764][INFO ][logstash.inputs.tcp      ][f5-syslog][534d75dc24db2f80817716772325d9e8b06a966713eed26c637a76ead6fac9d3] Starting tcp input listener {:address=>"0.0.0.0:5245", :ssl_enable=>false}
logstash       | [2022-10-05T10:49:53,800][INFO ][logstash.agent           ] Pipelines running {:count=>2, :running_pipelines=>[:"generic-json", :"f5-syslog"], :non_running_pipelines=>[]}
logstash       | [2022-10-05T10:49:53,859][INFO ][logstash.inputs.udp      ][f5-syslog][6b1a3bcafd6af04a218dc8bc08fd769694c78e2f607867cdc1164826a91ee915] Starting UDP listener {:address=>"0.0.0.0:5245"}
logstash       | [2022-10-05T10:49:53,882][INFO ][logstash.inputs.udp      ][f5-syslog][6b1a3bcafd6af04a218dc8bc08fd769694c78e2f607867cdc1164826a91ee915] UDP listener started {:address=>"0.0.0.0:5245", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}

Looking at the network calls, I see only this one being sent:

http://192.168.147.137:8080/api/v1/logstashStatus

which returns status code 304 with the following payload:

{"logstashAPI":true,"pipelines":["f5-syslog","generic-json"]}

I don't see any API calls towards the backend.
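
For reference, the same status check can be reproduced outside the UI; a minimal sketch (browser console or Node 18+), assuming the backend is reached on the host and port shown above:

// Query the backend status endpoint directly; host, port and payload are from the report above.
const res = await fetch('http://192.168.147.137:8080/api/v1/logstashStatus');
console.log(res.status);        // 304 here (browser cache revalidation), 200 on a fresh request
console.log(await res.text());  // {"logstashAPI":true,"pipelines":["f5-syslog","generic-json"]}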

I think the backend appears inactive because the websocket attempts to connect to localhost.

The issue is definitely here:

ws = await new WebSocket('ws://localhost:8080/api/v1/getLogstashOutput');

localhost is hardcoded; preparing a PR.
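
A minimal sketch of one possible fix, deriving the websocket address from the page the UI was served from (window.location) instead of the hardcoded localhost; the actual PR may take a different approach:

// Build the WebSocket URL from the page's own host and port, so the frontend
// also works when the stack is reached over a non-local address
// (e.g. 192.168.147.137:8080 as in this report).
const wsProtocol = window.location.protocol === 'https:' ? 'wss' : 'ws';
const wsUrl = `${wsProtocol}://${window.location.host}/api/v1/getLogstashOutput`;
ws = await new WebSocket(wsUrl);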