atom/watcher

Atom Filesystem Watcher

Potential memory leak in native watcher

Arcanemagus opened this issue

Description

There might be a very slow memory leak in the native watcher.

Steps to Reproduce

  1. Leave the native watcher running for over an hour.

Expected behavior:

No slow memory leak.

Actual behavior:

The memory usage reported in the console log suggests a slow leak in the native watcher: usage grew by 2.04 MiB over 18 hours.

main.log:

[ ..\src\log.cpp: 38] FileLogger opened.
[ ..\src\hub.cpp:114] Sending command [Message [CommandPayload id 1 log to file C:\Users\abneyl\watcher-logs\native\worker.log]] to Thread[worker thread].
[ ..\src\hub.cpp:114] Sending command [Message [CommandPayload id 2 log to file C:\Users\abneyl\watcher-logs\native\polling.log]] to Thread[polling thread].
[..\src\thread.cpp: 95] Processing offline command: [CommandPayload id 2 log to file C:\Users\abneyl\watcher-logs\native\polling.log].
[..\src\thread.cpp: 97] Result: OK.
[ ..\src\hub.cpp:143] Received ack message [Message [AckPayload ack 2]].
[ ..\src\hub.cpp:143] Received ack message [Message [AckPayload ack 1]].
[ ..\src\hub.cpp:114] Sending command [Message [CommandPayload id 3 add Z:\projects\foo\bar at channel 1]] to Thread[worker thread].
[ ..\src\hub.cpp:143] Received ack message [Message [AckPayload ack 3]].

worker.log:

[ ..\src\log.cpp: 38] FileLogger opened.
[..\src\worker\windows\windows_worker_platform.cpp:141] Added directory root Z:\projects\foo\bar at channel 1.
[..\src\worker\windows\subscription.cpp: 54] Scheduling the next change callback for channel 1.
[..\src\worker\windows\windows_worker_platform.cpp:206] Attempting to revert to a network-friendly buffer size.
[..\src\worker\windows\subscription.cpp: 54] Scheduling the next change callback for channel 1.

Native watcher console log.
Polling watcher console log (for comparison).
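
For reference, a drift figure like the 2.04 MiB above would typically come from sampling the process's memory at a fixed interval and comparing it against a baseline. A minimal, hypothetical sketch of such a probe in Node (not watcher-stress's actual code):

// Hypothetical leak probe: log resident set size every 10 minutes along with
// the drift against the value recorded at startup. A slow leak of a few MiB
// over many hours only becomes visible at this timescale.
const MiB = 1024 * 1024;
const baseline = process.memoryUsage().rss;

setInterval(() => {
  const rss = process.memoryUsage().rss;
  const drift = ((rss - baseline) / MiB).toFixed(2);
  console.log(`${new Date().toISOString()} rss ${(rss / MiB).toFixed(2)} MiB (drift ${drift} MiB)`);
}, 10 * 60 * 1000);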

Reproduces how often:

Unknown; I will update this issue if I see it again (and remember).

Versions

smashwilson/watcher-stress@5b969c2
@atom/watcher@0.0.4

As a note, a second instance was showing the same behavior, but after 1.05 days it dropped from 25.62 MiB to 19.34 MiB, so something got cleaned up. It then resumed slowly gaining memory again.

You know, I wonder if this is just GC running. I don't currently do anything to trigger GC in watcher-stress.
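
One low-effort way to test the GC hypothesis would be to expose the V8 garbage collector and force a collection before each reading; growth that survives a forced GC is more likely to be a real leak. A minimal sketch, assuming a standalone Node script rather than watcher-stress itself:

// Hypothetical check: run with `node --expose-gc gc-check.js`.
// Forcing a full collection before each reading removes "garbage that simply
// hasn't been collected yet" from the picture.
function readAfterGc() {
  if (typeof global.gc === 'function') {
    global.gc(); // only defined when Node is started with --expose-gc
  }
  const { rss, heapUsed } = process.memoryUsage();
  console.log(`rss=${(rss / 1048576).toFixed(2)} MiB heapUsed=${(heapUsed / 1048576).toFixed(2)} MiB`);
}

setInterval(readAfterGc, 60 * 1000);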

It's certainly possible. Is the reported memory that of watcher-stress itself, or the actual watcher's usage?

It's watcher-stress, and it just uses process methods to measure. So it might even be measuring the wrong things... ?

Ah, I was assuming it was measuring the memory of the actual watcher in question. If it's the entire usage of watcher-stress then that being GC definitely sounds plausible.

So it might even be measuring the wrong things... ?

At the least, I was wrong in my assumption about what was being measured 😛.
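
For reference, the "process methods" mentioned above are presumably process.memoryUsage(), which reports on the whole Node process rather than the native watcher in isolation:

// process.memoryUsage() describes the calling Node process as a whole:
//   rss      - total resident memory, including anything the native addon allocates
//   heapUsed - live JS objects on the V8 heap (this is what shrinks after a GC run)
//   external - memory for C++ objects tied to JS objects (e.g. Buffers)
// None of these fields isolates the native watcher from the harness's own JS churn.
const { rss, heapUsed, external } = process.memoryUsage();
console.log({ rss, heapUsed, external });

A reading taken this way mixes the harness's own allocations with whatever the in-process native addon holds, which is consistent with the "measuring the wrong things" concern above.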