deis / logger

In-memory log buffer used by Deis Workflow.

Home Page: https://deis.com

duplicate logs from deis apps

bacongobbler opened this issue · comments

After deploying workflow-beta4, I saw duplicate logs for everything:

><> deis logs
INFO [go]: bacongobbler created initial release
INFO [go]: domain go added
INFO [go]: bacongobbler created initial release
INFO [go]: domain go added
INFO [go]: bacongobbler deployed 779ca3d
INFO [go]: bacongobbler deployed 779ca3d
go-v2-web-wusof -- 2016/05/10 20:10:47 Successfully copied home/go:git-779ca3d7/push/slug.tgz to slug.tgz
go-v2-web-wusof -- 2016/05/10 20:10:47 listening on 5000...
go-v2-web-wusof -- 2016/05/10 20:10:47 Successfully copied home/go:git-779ca3d7/push/slug.tgz to slug.tgz
go-v2-web-wusof -- 2016/05/10 20:10:47 listening on 5000...
go-v2-web-wusof -- 2016/05/10 20:11:24 10.246.25.7 GET /
go-v2-web-wusof -- 2016/05/10 20:11:24 10.246.25.7 GET /
go-v2-web-wusof -- 2016/05/10 20:12:39 10.246.25.1 GET /
go-v2-web-wusof -- 2016/05/10 20:12:39 10.246.25.1 GET /favicon.ico
go-v2-web-wusof -- 2016/05/10 20:12:39 10.246.25.1 GET /
go-v2-web-wusof -- 2016/05/10 20:12:39 10.246.25.1 GET /favicon.ico

Hmm, this looks like an issue with fluentd. From fluentd's logs:

2016-05-10 20:07:15 +0000 [info]: following tail of /var/log/containers/go-v2-web-wusof_go_POD-613201007c0c694f34c6d0bf9e4f0241088e308a36de3e30ede5f6bc47ff9770.log
2016-05-10 20:11:15 +0000 [info]: following tail of /var/log/containers/go-v2-web-wusof_go_go-web-ac0969abd3d674353e7022000091c5502853b822a7cd5a04bd04caa223bf930d.log

The _POD log doesn't have anything in it. I noticed this last release, but it seemed to have cleared itself up. It might be back, though. I'll look into it.

Yeah, I just confirmed that:

[vagrant@kubernetes-node-1 ~]$ sudo cat /var/log/containers/go-v2-web-wusof_go_go-web-ac0969abd3d674353e7022000091c5502853b822a7cd5a04bd04caa223bf930d.log
{"log":"2016/05/10 20:10:47 Successfully copied home/go:git-779ca3d7/push/slug.tgz to slug.tgz\n","stream":"stderr","time":"2016-05-10T20:10:47.472981917Z"}
{"log":"2016/05/10 20:10:47 listening on 5000...\n","stream":"stderr","time":"2016-05-10T20:10:47.556449434Z"}
{"log":"2016/05/10 20:11:24 10.246.25.7 GET /\n","stream":"stderr","time":"2016-05-10T20:11:24.480801751Z"}
{"log":"2016/05/10 20:12:39 10.246.25.1 GET /\n","stream":"stderr","time":"2016-05-10T20:12:39.518523791Z"}
{"log":"2016/05/10 20:12:39 10.246.25.1 GET /favicon.ico\n","stream":"stderr","time":"2016-05-10T20:12:39.584165798Z"}
[vagrant@kubernetes-node-1 ~]$ sudo cat /var/log/containers/go-v2-web-wusof_go_POD-613201007c0c694f34c6d0bf9e4f0241088e308a36de3e30ede5f6bc47ff9770.log

is there an existing closed issue that we should re-open? :)

No, I don't think I ever opened an issue, because no one else had this problem. I'll use this one to work from.

For posterity, here are the logger logs as well:

><> kd logs deis-logger-yzmau
2016/05/10 18:56:58 syslogish server running
2016/05/10 18:56:58 weblog server running on 0.0.0.0:8088
2016/05/10 18:56:58 deis-logger running
2016/05/10 20:02:42 Creating buffer for app:go
2016/05/10 20:12:43 Returning the last 100 lines for go
2016/05/10 20:14:32 Returning the last 100 lines for go

Which looks fine to me!

If this is an existing issue, I think we can slate this for rc1, since no new PRs introduced this behaviour.

K, let me see if I understand what's happening real quick, so we know what to tell people who install beta4.

OK, so given the regex we use to parse the controller log messages, and log messages taken directly from a running controller, I know the parsing works:

https://www.debuggex.com/r/OEWkh_XoBBeNUCbE
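For readers without the debuggex link handy, the controller lines in the `deis logs` output above follow an `INFO [app]: message` shape. A sketch of matching that shape in Go, using a hypothetical pattern written for this example (the real regex lives in the deis/logger source and may differ):

```go
package main

import (
	"fmt"
	"regexp"
)

// Hypothetical pattern for illustration only: it matches the
// "INFO [go]: bacongobbler deployed 779ca3d" lines shown earlier,
// capturing the level, the app name, and the message body.
var controllerLine = regexp.MustCompile(`^(INFO|WARN|ERROR) \[([^\]]+)\]: (.+)$`)

func main() {
	m := controllerLine.FindStringSubmatch("INFO [go]: bacongobbler deployed 779ca3d")
	if m == nil {
		fmt.Println("no match")
		return
	}
	// prints: level=INFO app=go msg="bacongobbler deployed 779ca3d"
	fmt.Printf("level=%s app=%s msg=%q\n", m[1], m[2], m[3])
}
```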

My thought right now is that fluentd is sending duplicate messages.

OK, it's definitely duplicate logs from fluentd.

Alright, so the problem is this: wherever logger (and stdout metrics) is running, we will get duplicate messages from all components running on the same host.
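One way to paper over the symptom on the receiving side is to drop a message when it is byte-identical to one recently seen. This is only an illustrative mitigation, not the fix the project shipped; it relies on the assumption that identical lines with identical nanosecond timestamps are delivery duplicates rather than real repeats:

```go
package main

import "fmt"

// dedup drops a message when it is identical to one of the last max
// messages seen. Illustration only: a real deployment would also want
// to bound memory per app and decide how to treat legitimate repeats.
type dedup struct {
	recent []string
	seen   map[string]struct{}
	max    int
}

func newDedup(max int) *dedup {
	return &dedup{seen: make(map[string]struct{}), max: max}
}

// accept reports whether msg should be kept (i.e. was not a recent duplicate).
func (d *dedup) accept(msg string) bool {
	if _, dup := d.seen[msg]; dup {
		return false
	}
	d.seen[msg] = struct{}{}
	d.recent = append(d.recent, msg)
	if len(d.recent) > d.max {
		old := d.recent[0]
		d.recent = d.recent[1:]
		delete(d.seen, old)
	}
	return true
}

func main() {
	d := newDedup(100)
	for _, m := range []string{
		"2016/05/10 20:10:47 listening on 5000...",
		"2016/05/10 20:10:47 listening on 5000...", // duplicate delivery
		"2016/05/10 20:11:24 10.246.25.7 GET /",
	} {
		if d.accept(m) {
			fmt.Println(m)
		}
	}
}
```

Running this prints the first and third lines only; the verbatim duplicate is suppressed.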