
A utility library to easily publish metrics through LogstashAppender and SLF4J


slf4j-metrics-publisher


Context

This library is useful when using the Elastic stack and sending logs to Logstash through Logback, with a configuration similar to this:

  <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>${LOGSTASH_URL}</destination>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder">
      <customFields>{"service.name":"my-service-name",
         "service.hostname":"${HOSTNAME}",
         "service.version": "${project_version}"}
      </customFields>
    </encoder>
    <keepAliveDuration>1 minute</keepAliveDuration>
  </appender>

With the above configuration and a properly configured Elastic stack, your application logs are sent to Logstash "wrapped" as JSON documents, and you'll be able to browse them in Kibana.

But what if, in addition to the application logs, you would also like to send custom metrics?

Getting started

This library contains only a single class, but packaging it this way brings consistency to the projects that use it (instead of copy/pasting the class into all your projects).

Import the library in your Maven project by adding this dependency:

<dependency>
  <groupId>com.societegenerale.commons</groupId>
  <artifactId>slf4j-metrics-publisher</artifactId>
  <version>1.0.0</version>
</dependency>
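If you build with Gradle instead, the equivalent declaration should be the following (derived from the same Maven Central coordinates):

```groovy
implementation 'com.societegenerale.commons:slf4j-metrics-publisher:1.0.0'
```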

Publishing the metric and its value(s)

Once the library is added to your classpath, creating and publishing events is as simple as this:

  Metric userLoggedInMetric = Metric.functional("user-logged-in");
  userLoggedInMetric.addAttribute("duration", duration);
  userLoggedInMetric.publish();

  • We first create the metric, giving it a type (functional) and a name (user-logged-in).
  • We then add an attribute to it (duration): it's a key/value pair, the value being a String. The attributes are stored in a Map, and we can add as many entries as we want.
  • When "publishing" the metric, the attribute entries are read and put into the MDC (see the SLF4J MDC documentation for more info), and a regular logging call is made. It is "enriched" with all the key/value pairs from the MDC, i.e. our metric and its attributes.
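The steps above can be illustrated with a simplified, self-contained stand-in for the library's Metric class (a sketch using only the JDK; the real class logs through SLF4J and its MDC, and its internals may differ):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical, simplified version of the library's Metric class,
// for illustration only: it collects attributes in a Map and, on
// publish(), builds the set of key/value pairs that the real class
// would push into the SLF4J MDC before making a logging call.
class MetricSketch {
    private final String type;
    private final String name;
    private final Map<String, String> attributes = new LinkedHashMap<>();

    private MetricSketch(String type, String name) {
        this.type = type;
        this.name = name;
    }

    static MetricSketch functional(String name) {
        return new MetricSketch("FUNCTIONAL", name);
    }

    void addAttribute(String key, String value) {
        attributes.put(key, value);
    }

    // In the real library, these entries would go into the MDC, a log
    // event would be emitted, and the MDC entries cleared afterwards.
    Map<String, String> publish() {
        Map<String, String> mdcEntries = new LinkedHashMap<>(attributes);
        mdcEntries.put("metricType", type);
        mdcEntries.put("metricName", name);
        System.out.println("publishing metric: " + mdcEntries);
        return mdcEntries;
    }
}

public class MetricDemo {
    public static void main(String[] args) {
        MetricSketch metric = MetricSketch.functional("user-logged-in");
        metric.addAttribute("duration", "125");
        metric.publish();
    }
}
```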

Receiving the metric

The metric "piggy-backs" on a regular log event, encoded in JSON and sent to Logstash. It has special attributes that make it possible to configure a filter in Logstash and redirect the JSON document to a dedicated Elasticsearch index (if that's what you want to do).
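As a sketch, such a Logstash pipeline could look like this (the tag, hosts, and index name are assumptions to adapt to your own setup):

```
filter {
  # documents carrying a metricName field are metrics, not plain logs
  if [metricName] {
    mutate { add_tag => ["metric"] }
  }
}

output {
  if "metric" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "metrics-%{+YYYY.MM.dd}"
    }
  }
}
```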

The JSON document will have these basic attributes:

  • metricName : in our example above, it's user-logged-in
  • metricType : typically, TECHNICAL or FUNCTIONAL - but you can create metrics with custom types.

as well as all the attributes that you've added to it.
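For the example above, the resulting document might look roughly like this (the timestamp, service fields, and values are illustrative assumptions; metricName and metricType are the attributes described above):

```json
{
  "@timestamp": "2024-01-15T10:15:30.123Z",
  "service.name": "my-service-name",
  "metricType": "FUNCTIONAL",
  "metricName": "user-logged-in",
  "duration": "125"
}
```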

Outcome

By storing these "metric documents" in a dedicated Elasticsearch index, you can build a Kibana dashboard showing them in a few minutes: for example, the number of "user-logged-in" events, and the min/max/average/percentile durations for them.
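The numbers behind such a dashboard could come from an Elasticsearch aggregation along these lines (a sketch: the index name is an assumption, and duration must be mapped as a numeric type for the statistics to work):

```json
{
  "query": { "term": { "metricName": "user-logged-in" } },
  "aggs": {
    "duration_stats": { "stats": { "field": "duration" } },
    "duration_percentiles": { "percentiles": { "field": "duration" } }
  }
}
```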

About


License:Apache License 2.0


Languages

Language: Java 96.9%, Shell 3.1%