convoyinc / KafkaVaultProvider

Extending Kafka FileProvider to use Vault secrets

KafkaVaultProvider

An extension of the Kafka Connect ConfigProvider that supports accessing Vault secrets via the HashiCorp Vault HTTP API.

Implements the ConfigProvider interface (org.apache.kafka.common.config.provider):

https://kafka.apache.org/21/javadoc/org/apache/kafka/common/config/provider/ConfigProvider.html

Type        Method                              Description
ConfigData  get(String path)                    Retrieves the data at the given Vault path
ConfigData  get(String path, Set<String> keys)  Retrieves the values for the given set of keys at the given Vault path
void        configure(Map<String, ?> configs)   Sets the internal configuration values needed to establish a Vault connection
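
For orientation, here is a minimal sketch of what an implementation of this interface looks like. This is not the repository's actual code: the class name SketchVaultProvider and the readSecret helper are hypothetical, and the configuration key names are borrowed from the Java example later in this README.

import java.util.HashMap;
import java.util.Map;
import java.util.Set;

import org.apache.kafka.common.config.ConfigData;
import org.apache.kafka.common.config.provider.ConfigProvider;

// A simplified sketch of a ConfigProvider implementation -- not the repository's actual code.
public class SketchVaultProvider implements ConfigProvider {

    private String requestUriBase;
    private String vaultToken;

    @Override
    public void configure(Map<String, ?> configs) {
        // Keep whatever is needed to reach Vault (key names follow the Java example below).
        this.requestUriBase = (String) configs.get("request_uri_base");
        this.vaultToken = (String) configs.get("x_vault_token");
    }

    @Override
    public ConfigData get(String path) {
        // Return every key/value pair stored at the Vault path.
        return new ConfigData(readSecret(path));
    }

    @Override
    public ConfigData get(String path, Set<String> keys) {
        // Fetch the path once, then keep only the requested keys.
        Map<String, String> all = readSecret(path);
        Map<String, String> selected = new HashMap<>();
        for (String key : keys) {
            if (all.containsKey(key)) {
                selected.put(key, all.get(key));
            }
        }
        return new ConfigData(selected);
    }

    @Override
    public void close() {
        // Nothing to release in this sketch.
    }

    // Placeholder for the HTTP call to the Vault API.
    private Map<String, String> readSecret(String path) {
        return new HashMap<>();
    }
}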

To use in Kafka Connect:

  1. Clone the project and build the JAR file.

  2. Put the final JAR file, say KafkaVaultProvider.jar, under the Kafka worker plugin folder. The default is /usr/share/java; this is the directory referenced by PLUGIN_PATH in the Kafka worker config file.

  3. Upload all of the dependency JARs to PLUGIN_PATH as well. Use the META-INF/MANIFEST.MF file inside your JAR to set the Class-Path of the dependent JARs that your code will use.

  4. In the Kafka worker config file, create two additional properties:

CONNECT_CONFIG_PROVIDERS: 'vault', // alias name of your ConfigProvider
CONNECT_CONFIG_PROVIDERS_VAULT_CLASS: 'com.convoy.KafkaVaultProvider.KafkaVaultProvider',
  5. Restart the workers.

  6. Update your connector config by POSTing it (for example with curl) to the Kafka Connect REST API; a sketch follows after this list. In the connector config, you can reference a value inside the ConfigData returned from ConfigProvider.get(path, keys) using syntax like:

database.password=${mycustom:/path/pass/to/get/method:password}

The returned ConfigData wraps a map such as {password: 123}, so the :password suffix selects that value.

  7. If you are still seeing a ClassNotFoundException, your Class-Path is probably not set up correctly.
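
To make step 6 concrete, here is a hedged sketch of submitting a connector config through the Connect REST API. The connector name, connector class, topic, and host/port are placeholders; only the ${vault:...} placeholder syntax and the provider alias come from the steps above.

# Hypothetical connector submission; adjust the host, name, and connector class for your setup
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
        "name": "my-sink-connector",
        "config": {
          "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
          "topics": "my-topic",
          "database.password": "${vault:/path/pass/to/get/method:password}"
        }
      }'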

Note:

  • If you are running on AWS ECS/EC2, supply the worker configuration through environment variables instead of editing the worker config file directly (see the sketch below).
  • The worker config file and the connector config file are two different files.
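
As a rough illustration, assuming a Confluent-style Kafka Connect image (which translates CONNECT_-prefixed environment variables into worker properties by lowercasing them and turning underscores into dots), the two forms correspond like this:

# Environment-variable form (e.g. in an ECS task definition)
CONNECT_CONFIG_PROVIDERS=vault
CONNECT_CONFIG_PROVIDERS_VAULT_CLASS=com.convoy.KafkaVaultProvider.KafkaVaultProvider

# Equivalent worker .properties entries
config.providers=vault
config.providers.vault.class=com.convoy.KafkaVaultProvider.KafkaVaultProvider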

Example usage:

  • Clone and build KafkaVaultProvider in the Terraform docker-build step
# Clone and build KafkaVaultProvider
rm -rf KafkaVaultProvider
git clone https://github.com/convoyinc/KafkaVaultProvider.git
cd KafkaVaultProvider
mvn clean install -DskipTests
cd ..
  • Add the JAR file in the Terraform Dockerfile
# Copy KafkaVaultProvider to plugin path
RUN mkdir -p ${CONNECT_PLUGIN_PATH}/KafkaVaultProvider/classes
ADD ./KafkaVaultProvider/target/KafkaVaultProvider.jar ${CONNECT_PLUGIN_PATH}/KafkaVaultProvider/KafkaVaultProvider.jar
ADD ./KafkaVaultProvider/target/classes/json-2019.jar ${CONNECT_PLUGIN_PATH}/KafkaVaultProvider/classes/json-2019.jar
  • Update the deployment Terraform container environment definition
{
  name: "CONNECT_CONFIG_PROVIDERS",
  value: 'file,vault'
},
{
  name: "CONNECT_CONFIG_PROVIDERS_FILE_CLASS",
  value: 'org.apache.kafka.common.config.provider.FileConfigProvider'
},
{
  name: "CONNECT_CONFIG_PROVIDERS_VAULT_CLASS",
  value: 'com.convoy.KafkaVaultProvider.KafkaVaultProvider'
}
  • Redeploy Kafka Connect

Example Java usage:

  1. Configure
KafkaVaultProvider kvp = new KafkaVaultProvider();
Map<String, String> configs = new HashMap<String, String>();
configs.put("request_uri_base", "http://127.0.0.1:8200/v1/");
configs.put("secret_type", "secret");
configs.put("secret_directory", "dummy-secrets");
configs.put("x_vault_token", "token-here");
kvp.configure(configs);
  2. Retrieve the value (a complete sketch follows below)
String value = kvp.get("secretPassword").data().get("secretKey");
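
Putting the two steps together, a minimal end-to-end sketch, assuming a local dev Vault at 127.0.0.1:8200 and the provider JAR on the classpath (the class name and configuration keys are taken from the steps above; the rest is illustrative):

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.common.config.ConfigData;
import com.convoy.KafkaVaultProvider.KafkaVaultProvider;

public class VaultProviderExample {
    public static void main(String[] args) {
        KafkaVaultProvider kvp = new KafkaVaultProvider();

        // Connection settings for a local dev Vault; the values are placeholders.
        Map<String, String> configs = new HashMap<>();
        configs.put("request_uri_base", "http://127.0.0.1:8200/v1/");
        configs.put("secret_type", "secret");
        configs.put("secret_directory", "dummy-secrets");
        configs.put("x_vault_token", "token-here");
        kvp.configure(configs);

        // get() returns a ConfigData; data() exposes the secret's key/value pairs.
        ConfigData data = kvp.get("secretPassword");
        System.out.println(data.data().get("secretKey"));
    }
}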

Special Thanks to: @adriank-convoy @Samlinxia


License: Apache License 2.0

