Nebo15 / logger_json

JSON log formatter for Elixir, with support for Google Cloud, DataDog, and other log providers.

Home Page: https://nebo15.github.io/logger_json/


DatadogLogger passing all JSON data to `message` field and not being parsed correctly

Nezteb opened this issue

I set up logger_json in a Phoenix service and am using the DatadogLogger (with all the same configs as in the README).
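For reference, that setup looks roughly like this (a sketch of the v5-era README config; exact keys may differ between versions):

# config/config.exs
config :logger,
  backends: [LoggerJSON]

# Route everything through the Datadog-flavored formatter
config :logger_json, :backend,
  metadata: :all,
  formatter: LoggerJSON.Formatters.DatadogLogger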

However, whenever I log something now, I see this in DataDog:
[Screenshot of the Datadog log view, 2022-11-23]

When clicking the "copy log as JSON" button in the top right, I get:

{
	"id": "<ID>",
	"content": {
		"timestamp": "2022-11-23T22:11:47.677Z",
		"tags": [
			"<REDACTED TAGS>"
		],
		"service": "<SERVICE NAME>",
		"message": "<REDACTED JSON I WANT DATADOG TO ACTUALLY USE/PARSE>",
		"attributes": {
			"service": "<SERVICE NAME>",
			"source": "stdout",
			"timestamp": 1669241507677
		}
	}
}

So, using the example from the README:

{
  "domain": ["elixir"],
  "duration": 3863403,
  "http": {
    "url": "http://localhost/create-account",
    "status_code": 200,
    "method": "GET",
    "referer": "http://localhost:4000/login",
    "request_id": "http_FlDCOItxeudZJ20AAADD",
    "useragent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.66 Safari/537.36",
    "url_details": {
      "host": "localhost",
      "port": 4000,
      "path": "/create-account",
      "queryString": "",
      "scheme": "http"
    }
  },
  "logger": {
    "thread_name": "#PID<0.1042.0>",
    "method_name": "Elixir.LoggerJSON.Plug.call/2"
  },
  "message": "",
  "network": {
    "client": {
      "ip": "127.0.0.1"
    }
  },
  "phoenix": {
    "controller": "Elixir.RecognizerWeb.Accounts.UserRegistrationController",
    "action": "new"
  },
  "request_id": "http_FlDCOItxeudZJ20AAADD",
  "syslog": {
    "hostname": [10, 10, 100, 100, 100, 100, 100],
    "severity": "info",
    "timestamp": "2020-12-14T19:16:55.088Z"
  }
}

All of that information gets shoved into the `message` field in DataDog, so it's never parsed and turned into attributes:

  "message": "{\"domain\":[\"elixir\"],\"duration\":3863403,\"http\":{\"url\":\"http://localhost/create-account\",\"status_code\":200,\"method\":\"GET\",\"referer\":\"http://localhost:4000/login\",\"request_id\":\"http_FlDCOItxeudZJ20AAADD\",\"useragent\":\"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.66 Safari/537.36\",\"url_details\":{\"host\":\"localhost\",\"port\":4000,\"path\":\"/create-account\",\"queryString\":\"\",\"scheme\":\"http\"}},\"logger\":{\"thread_name\":\"#PID<0.1042.0>\",\"method_name\":\"Elixir.LoggerJSON.Plug.call/2\"},\"message\":\"\",\"network\":{\"client\":{\"ip\":\"127.0.0.1\"}},\"phoenix\":{\"controller\":\"Elixir.RecognizerWeb.Accounts.UserRegistrationController\",\"action\":\"new\"},\"request_id\":\"http_FlDCOItxeudZJ20AAADD\",\"syslog\":{\"hostname\":[10,10,100,100,100,100,100],\"severity\":\"info\",\"timestamp\":\"2020-12-14T19:16:55.088Z\"}}"

The only things I suspect at this point are the DataDog agent itself and the FluentBit container we use for log routing. We don't have any crazy custom configuration for either, though. The DataDog agent has the following env vars set:

ECS_FARGATE=true
DD_ENV=dev
DD_APM_ENABLED=true
DD_PROFILING_ENABLED=true
DD_PROCESS_AGENT_ENABLED=true
DD_DOGSTATSD_NON_LOCAL_TRAFFIC=true

But as far as I can tell from the DataDog docs, none of these should change how JSON logs are parsed. So my last guess is that I've set up logger_json incorrectly?
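One way to rule that out is to check what actually hits stdout before FluentBit touches it, e.g. from iex (a minimal sketch; the expected output line is illustrative, not captured from our service):

# iex -S mix
require Logger
Logger.metadata(request_id: "test-123")
Logger.info("hello from logger_json")
# With the backend configured as above, stdout should show one JSON object
# per line, e.g.:
# {"message":"hello from logger_json","request_id":"test-123","syslog":{...}}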

Any and all ideas/help is appreciated!

Hey. I feel like you should start by digging into the FluentBit configuration; it looks like it doesn't have a rule to properly parse JSON from the container's stdout (you can tail the log file to see whether LoggerJSON does its job and writes formatted JSON lines there first).
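If that's the culprit, the usual fix is a Fluent Bit `parser` filter that unwraps the JSON string stored under the record's log key. A minimal sketch, assuming the stock `json` parser from Fluent Bit's bundled parsers.conf (the Match and Key_Name values are guesses about your pipeline, not taken from it):

[FILTER]
    Name         parser
    Match        *
    Key_Name     log
    Parser       json
    Reserve_Data On

With that in place, the forwarded record carries the decoded fields instead of one escaped string, and DataDog can promote them to attributes.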

@AndrewDryga I would, but my entire team got laid off today, so this is no longer my problem. 😄 In the end I doubt it's a logger_json issue, but I wanted to at least open an issue and see what others thought.

Take care!

I'm sorry to hear that, @Nezteb; I know how that feels. Please take care of yourself. I bet you and the other guys will find a better place to work.