splunk / splunk-ansible

Ansible playbooks for configuring and managing Splunk Enterprise and Universal Forwarder deployments

Distributing 'local' app configurations

svanschie opened this issue · comments

I'm having a bit of an issue with distributing configurations of an app that are set via its setup page (in this case the Opsgenie app, https://splunkbase.splunk.com/app/3759/).

I can access the setup page on the deployer node and fill in the API key it needs. The key is stored in local/passwords.conf, and because the app ships disabled by default, the setup also writes some configuration to local/app.conf to enable it.
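For reference, the setup ends up writing something like the following under the app's local/ directory. This is a sketch: the app directory name and credential stanza are illustrative, and the encrypted `$7$` value depends on this instance's splunk.secret:

```ini
# etc/apps/opsgenie/local/app.conf (app directory name illustrative)
[install]
state = enabled

# etc/apps/opsgenie/local/passwords.conf
# The realm/username parts of the stanza depend on how the app stores
# the credential; the value is encrypted with this node's splunk.secret.
[credential::opsgenie_api_key:]
password = $7$encryptedblob...
```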

Due to the changes in #416, this configuration is no longer copied to the search head cluster members. I'm not sure how to properly resolve this. If I understood correctly, it's against Splunk best practices to distribute an app's local directory, but then what would be the proper way to set up, for example, credentials for an app? According to https://dev.splunk.com/enterprise/docs/developapps/setuppage/ the setup page still seems to be the way to go for this.

To work around the issue I could configure the app using configuration management tools, or create my own app (per cluster) containing the API key. However, I'd really rather not do either.

The only somewhat proper solution I see is to use default.yml to set the desired local configurations on every node, but even that doesn't feel entirely right.
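For concreteness, the default.yml route I mean would look something like the fragment below. This is only a sketch: the `splunk.conf` schema has changed between splunk-ansible versions, so the exact keys and the list-vs-dict shape should be checked against the project's advanced-configuration docs for the version in use, and all values here are hypothetical:

```yaml
# default.yml fragment (hypothetical values; schema per splunk-ansible's
# advanced docs, which may differ between versions)
splunk:
  conf:
    - key: app
      value:
        # Write into the app's local directory on every node
        directory: /opt/splunk/etc/apps/opsgenie/local
        content:
          install:
            state: enabled
```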

Any suggestions as to what would be the best way to resolve this issue?

I guess there's always some exceptions to the original line of thinking in that PR :)

I hadn't considered apps that had interactive configurations. I wonder if this is something that can be done on the SHC captain/members after it has been initialized? As in, the default OpsGenie app gets shipped from deployer to the SH nodes, and the interactive portion of the configuration is done on the SH itself. Theoretically, knowledge objects and configurations should be replicated by the clustering, although I'm not certain if that behavior holds with this particular app or passwords.conf.

If that doesn't work, let me play around with some alternatives. I was recently made aware that the deployer has a new option for specifying a push mode, and there's been a request to make this a native option in these Ansible plays.
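For anyone landing here later: the push mode referred to is, as I understand it, the per-app `deployer_push_mode` setting introduced in Splunk 8.0, set in the app's app.conf on the deployer. A hedged sketch (app directory name illustrative; verify the mode names against the app.conf spec for your Splunk version):

```ini
# $SPLUNK_HOME/etc/shcluster/apps/opsgenie/local/app.conf on the deployer
[shclustering]
# full             - push default and local contents as-is to members
# merge_to_default - merge local into default on members (legacy behaviour)
# local_only       - push only the app's local configuration
# default_only     - push only the app's default configuration
deployer_push_mode = full
```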

At least in this case, configurations made on the setup page on the deployer are not replicated to the SH members. I think the local configuration is only synchronised by an 'apply shcluster-bundle'?

Unfortunately I can't seem to configure the app directly on the SH. On the deployer there's a 'setup' button on the apps page, but not on the SH itself. I'm not sure how this works or why the setup page doesn't show up there.

The push mode seems interesting; for my situation it should definitely be sufficient.

If there's anything I can help with, for example testing, please let me know!

I changed the install to support local now (but not local/apps.conf) per the PR comments. Let me know if this satisfies your use case, otherwise we can go back to the drawing board!

Awesome, thank you :) Looks like that should definitely do the job for my use case. I see it's in release 8.0.4.1, but that tag doesn't show up in Docker Hub yet; any ETA on when it will be available?

@svanschie I'm publishing it today. I'll update here when it's posted.

8.0.4.1 is now published!

I have just upgraded our Splunk environment to the latest version and can confirm the changes work like a charm for me :)

I don't explicitly set SPLUNK_SECRET (yet), so I had to set the splunk.secret file on the deployer to the same value as the search heads', but that is to be expected.

Thank you for the work @nwang92 @alishamayor!