ksync / ksync

Sync files between your local system and a kubernetes cluster.

Home Page: https://ksync.github.io/ksync/

No changes written to file

Panaetius opened this issue · comments

I'm running ksync in a DIND local cluster scenario.

I've configured everything and ran ksync init, then ksync create --name coordinator -l app=mlbench,component=coordinator --context local --reload=false /home/zenon/DEV/epfl/mlbench/mlbench/coordinator/ /app/code/

(I tried with reload=true as well, but the application should reload by itself anyway.)

ksync watch works as well; debug output:

ksync watch --log-level debug
DEBU[0000] initializing kubernetes client context=local
DEBU[0000] kubernetes client created host="http://localhost:8080"
DEBU[0000] watching for updates ContainerName= LocalPath=/home/zenon/DEV/epfl/mlbench/mlbench/coordinator/ LocalReadOnly=false Name=coordinator Namespace=default Pod= Reload=false RemotePath=/app/code/ RemoteReadOnly=false Selector="app=mlbench,component=coordinator"
DEBU[0000] cleaning background daemon
DEBU[0000] starting syncthing args="[/home/zenon/.ksync/bin/syncthing -gui-address localhost:8384 -gui-apikey ksync -home /home/zenon/.ksync/syncthing -no-browser]" cmd=/home/zenon/.ksync/bin/syncthing
INFO[0000] listening bind=127.0.0.1 port=40322
DEBU[0000] RESTY 2018/07/06 16:14:26 ERROR [Get http://localhost:8384/rest/system/config: dial tcp [::1]:8384: connect: connection refused] Attempt [1]
DEBU[0000] new event deleted=false name=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb status=Running type=ADDED
INFO[0000] new pod detected pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator
DEBU[0000] added service ID=9b01da22760a145e12ecc4d03c02d64f4fdcb0804f81c36c96cc7be10612e07d Name=mlbench-coordinator NodeName=kube-node-2 PodName=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb
DEBU[0000] checking to see if radar is ready nodeName=kube-node-2
DEBU[0000] found pod name Namespace=kube-system RadarPort=40321 SyncthingAPI=8384 SyncthingListener=22000 nodeName=kube-node-2 podName=ksync-dmp8h
DEBU[0000] found pod nodeName=kube-node-2 podName=ksync-dmp8h status=Running
DEBU[0000] starting tunnel LocalPort=42425 Namespace=kube-system Out= PodName=ksync-dmp8h RemotePort=40321 url="http://localhost:8080/api/v1/namespaces/kube-system/pods/ksync-dmp8h/portforward"
DEBU[0000] tunnel running LocalPort=42425 Namespace=kube-system Out="Forwarding from 127.0.0.1:42425 -> 40321\nForwarding from [::1]:42425 -> 40321\n" PodName=ksync-dmp8h RemotePort=40321
DEBU[0000] [monitor] 16:14:26 INFO: Starting syncthing name=syncthing
DEBU[0000] [start] 16:14:26 INFO: Generating ECDSA key and certificate for syncthing... name=syncthing
DEBU[0000] [CV3OM] 16:14:26 INFO: syncthing v0.14.48 "Dysprosium Dragonfly" (go1.10.2 linux-amd64) teamcity@build.syncthing.net 2018-05-14 06:53:06 UTC name=syncthing
DEBU[0000] [CV3OM] 16:14:26 INFO: My ID: CV3OM2R-UEGMGXC-DZXVZTE-AZTPJMX-ELTLMGM-7G47SQH-LQHAYYL-FWLOPQH name=syncthing
DEBU[0000] checking to see if radar is ready nodeName=kube-node-2
DEBU[0000] found pod name Namespace=kube-system RadarPort=40321 SyncthingAPI=8384 SyncthingListener=22000 nodeName=kube-node-2 podName=ksync-dmp8h
DEBU[0000] found pod nodeName=kube-node-2 podName=ksync-dmp8h status=Running
DEBU[0000] starting tunnel LocalPort=39651 Namespace=kube-system Out= PodName=ksync-dmp8h RemotePort=8384 url="http://localhost:8080/api/v1/namespaces/kube-system/pods/ksync-dmp8h/portforward"
DEBU[0000] tunnel running LocalPort=39651 Namespace=kube-system Out="Forwarding from 127.0.0.1:39651 -> 8384\nForwarding from [::1]:39651 -> 8384\n" PodName=ksync-dmp8h RemotePort=8384
DEBU[0000] checking to see if radar is ready nodeName=kube-node-2
DEBU[0000] found pod name Namespace=kube-system RadarPort=40321 SyncthingAPI=8384 SyncthingListener=22000 nodeName=kube-node-2 podName=ksync-dmp8h
DEBU[0000] found pod nodeName=kube-node-2 podName=ksync-dmp8h status=Running
DEBU[0000] starting tunnel LocalPort=35347 Namespace=kube-system Out= PodName=ksync-dmp8h RemotePort=22000 url="http://localhost:8080/api/v1/namespaces/kube-system/pods/ksync-dmp8h/portforward"
DEBU[0000] tunnel running LocalPort=35347 Namespace=kube-system Out="Forwarding from 127.0.0.1:35347 -> 22000\nForwarding from [::1]:35347 -> 22000\n" PodName=ksync-dmp8h RemotePort=22000
DEBU[0000] RESTY 2018/07/06 16:14:26 ERROR [Get http://localhost:8384/rest/system/config: dial tcp [::1]:8384: connect: connection refused] Attempt [1]
DEBU[0001] RESTY 2018/07/06 16:14:27 ERROR [Get http://localhost:8384/rest/system/config: dial tcp [::1]:8384: connect: connection refused] Attempt [2]
DEBU[0001] [CV3OM] 16:14:27 INFO: Single thread SHA256 performance is 235 MB/s using minio/sha256-simd (174 MB/s using crypto/sha256). name=syncthing
DEBU[0001] [CV3OM] 16:14:27 INFO: Archiving a copy of old config file format at: /home/zenon/.ksync/syncthing/config.xml.v26 name=syncthing
DEBU[0001] RESTY 2018/07/06 16:14:27 ERROR [Get http://localhost:8384/rest/system/config: dial tcp [::1]:8384: connect: connection refused] Attempt [2]
DEBU[0002] [CV3OM] 16:14:28 INFO: Hashing performance is 206.12 MB/s name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: Updating database schema version from 0 to 2... name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: Updated symlink type for 0 index entries and added 0 invalid files to global list name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: Finished updating database schema version from 0 to 2 name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: Overall send rate is unlimited, receive rate is unlimited name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: Rate limits do not apply to LAN connections name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: TCP listener ([::]:22000) starting name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: Device CV3OM2R-UEGMGXC-DZXVZTE-AZTPJMX-ELTLMGM-7G47SQH-LQHAYYL-FWLOPQH is "ZENON-PC" at [dynamic] name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: Loading HTTPS certificate: open /home/zenon/.ksync/syncthing/https-cert.pem: no such file or directory name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: Creating new HTTPS certificate name=syncthing
DEBU[0002] RESTY 2018/07/06 16:14:28 ERROR [Get http://localhost:8384/rest/system/config: dial tcp [::1]:8384: connect: connection refused] Attempt [3]
DEBU[0002] [CV3OM] 16:14:28 INFO: GUI and API listening on 127.0.0.1:8384 name=syncthing
DEBU[0002] [CV3OM] 16:14:28 INFO: Access the GUI via the following URL: http://localhost:8384/ name=syncthing
DEBU[0003] [CV3OM] 16:14:29 INFO: Adding folder "coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb" (coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb) name=syncthing
DEBU[0003] [CV3OM] 16:14:29 INFO: No stored folder metadata for "coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb": recalculating name=syncthing
DEBU[0003] [CV3OM] 16:14:29 INFO: Ready to synchronize "coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb" (coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb) (readwrite) name=syncthing
DEBU[0003] [CV3OM] 16:14:29 INFO: Completed initial scan of readwrite folder "coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb" (coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb) name=syncthing
INFO[0003] updating pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator
INFO[0003] updating pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator
WARN[0003] {"alloc":9305640,"connectionServiceStatus":{"tcp://0.0.0.0:22000":{"lanAddresses":["tcp://0.0.0.0:22000"],"wanAddresses":["tcp://0.0.0.0:22000"]}},"cpuPercent":0,"goroutines":42,"myID":"CV3OM2R-UEGMGXC-DZXVZTE-AZTPJMX-ELTLMGM-7G47SQH-LQHAYYL-FWLOPQH","pathSeparator":"/","startTime":"2018-07-06T16:14:26.610132871+02:00","sys":40863992,"tilde":"/home/zenon","uptime":2,"urVersionMax":3}
INFO[0003] syncthing listening port=8384 syncthing=localhost
WARN[0003] {"alloc":9332440,"connectionServiceStatus":{"tcp://0.0.0.0:22000":{"lanAddresses":["tcp://0.0.0.0:22000"],"wanAddresses":["tcp://0.0.0.0:22000"]}},"cpuPercent":0,"goroutines":42,"myID":"CV3OM2R-UEGMGXC-DZXVZTE-AZTPJMX-ELTLMGM-7G47SQH-LQHAYYL-FWLOPQH","pathSeparator":"/","startTime":"2018-07-06T16:14:26.610132871+02:00","sys":40863992,"tilde":"/home/zenon","uptime":2,"urVersionMax":3}
DEBU[0003] restarting local syncthing
DEBU[0003] [CV3OM] 16:14:29 INFO: Restarting name=syncthing
INFO[0003] finished unary call with code OK grpc.code=OK grpc.method=RestartSyncthing grpc.service=proto.ksync.Ksync grpc.start_time="2018-07-06T16:14:29+02:00" grpc.time_ns=718597 peer.address="127.0.0.1:40044" span.kind=server system=grpc
DEBU[0003] [CV3OM] 16:14:29 INFO: Exiting name=syncthing
INFO[0003] folder sync running pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator
DEBU[0003] RESTY 2018/07/06 16:14:29 ERROR [unexpected EOF] Attempt [1]
DEBU[0003] [monitor] 16:14:29 INFO: Syncthing exited: exit status 3 name=syncthing
DEBU[0004] [monitor] 16:14:30 INFO: Starting syncthing name=syncthing
DEBU[0004] RESTY 2018/07/06 16:14:30 ERROR [Get http://localhost:8384/rest/events?since=14: dial tcp [::1]:8384: connect: connection refused] Attempt [2]
DEBU[0004] [CV3OM] 16:14:30 INFO: syncthing v0.14.48 "Dysprosium Dragonfly" (go1.10.2 linux-amd64) teamcity@build.syncthing.net 2018-05-14 06:53:06 UTC name=syncthing
DEBU[0004] [CV3OM] 16:14:30 INFO: My ID: CV3OM2R-UEGMGXC-DZXVZTE-AZTPJMX-ELTLMGM-7G47SQH-LQHAYYL-FWLOPQH name=syncthing
DEBU[0005] [CV3OM] 16:14:31 INFO: Single thread SHA256 performance is 235 MB/s using minio/sha256-simd (177 MB/s using crypto/sha256). name=syncthing
DEBU[0006] RESTY 2018/07/06 16:14:32 ERROR [Get http://localhost:8384/rest/events?since=14: dial tcp [::1]:8384: connect: connection refused] Attempt [3]
DEBU[0006] [CV3OM] 16:14:32 INFO: Hashing performance is 198.40 MB/s name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Ready to synchronize "coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb" (coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb) (readwrite) name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Overall send rate is unlimited, receive rate is unlimited name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Rate limits do not apply to LAN connections name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: TCP listener ([::]:22000) starting name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Device CV3OM2R-UEGMGXC-DZXVZTE-AZTPJMX-ELTLMGM-7G47SQH-LQHAYYL-FWLOPQH is "ZENON-PC" at [dynamic] name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Device HCF2NSO-ALBGN5R-4E7LGV2-CNNTDYA-EKWTBCK-44IA6QK-2IAXMHI-LK2KVQL is "virtuous-fly-mlbench-coordinator-699747c74b-2gnmb" at [tcp://127.0.0.1:35347] name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Completed initial scan of readwrite folder "coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb" (coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb) name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: GUI and API listening on 127.0.0.1:8384 name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Access the GUI via the following URL: http://localhost:8384/ name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Established secure connection to HCF2NSO-ALBGN5R-4E7LGV2-CNNTDYA-EKWTBCK-44IA6QK-2IAXMHI-LK2KVQL at 127.0.0.1:36380-127.0.0.1:35347/tcp-client (TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305) name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Device HCF2NSO-ALBGN5R-4E7LGV2-CNNTDYA-EKWTBCK-44IA6QK-2IAXMHI-LK2KVQL client is "syncthing v0.14.48" named "ksync-dmp8h" at 127.0.0.1:36380-127.0.0.1:35347/tcp-client name=syncthing
DEBU[0006] [CV3OM] 16:14:32 INFO: Device HCF2NSO-ALBGN5R-4E7LGV2-CNNTDYA-EKWTBCK-44IA6QK-2IAXMHI-LK2KVQL folder "coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb" (coordinator-virtuous-fly-mlbench-coordinator-699747c74b-2gnmb) has a new index ID (0x84FED7B664DF945D) name=syncthing
INFO[0007] updating pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator
INFO[0007] update complete pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator
INFO[0018] updating pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator
INFO[0018] update complete pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator
INFO[0018] updating pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator
INFO[0018] update complete pod=virtuous-fly-mlbench-coordinator-699747c74b-2gnmb spec=coordinator

So everything seems to work OK.

ksync get outputs:
+-------------+---------------------+------------+----------+---------------------------------------------------+-----------+
|    NAME     |        LOCAL        |   REMOTE   |  STATUS  |                        POD                        | CONTAINER |
+-------------+---------------------+------------+----------+---------------------------------------------------+-----------+
| coordinator | mlbench/coordinator | /app/code/ | watching | virtuous-fly-mlbench-coordinator-699747c74b-2gnmb |           |
+-------------+---------------------+------------+----------+---------------------------------------------------+-----------+

But if I connect to the pod and look at the content of a file I changed locally, it still has the original content. I.e., changes don't take effect even though ksync registers the change and reports "update complete".

Changing the remote file (kubectl exec -it virtuous-fly-mlbench-coordinator-699747c74b-2gnmb -- /bin/sh and editing it) doesn't get registered in ksync watch or syncthing.

Since everything seems to be working, except it's not working, I'm a bit at a loss and any help would be appreciated.

commented

@Panaetius it can be confusing to debug at this point, so let me clear a few pieces up.

I'm not entirely sure that -l some=something,somethingelse=somewhere (i.e. with the comma separator) works in ksync, though admittedly I haven't tried it. I usually specify multiple selectors with multiple -l flags. I only mention it because I suspect the entire selector argument was ingested and processed as a single string, in which case nothing would match. Try recreating your ksync create with separate flags and see what happens.
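Spelled out, the multiple-flag form described above would be something like this (same arguments as the original create; a sketch, not verified against this cluster):

```shell
ksync create --name coordinator \
  -l app=mlbench -l component=coordinator \
  --context local --reload=false \
  /home/zenon/DEV/epfl/mlbench/mlbench/coordinator/ /app/code/
```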

The "pod reload" triggering/complete messages are actually local, generated from events emitted by the remote syncthing event stream. Since the selectors aren't applied when looking for a pod (after it is first found) that would explain why the reloads were triggered when things changed locally, but the file was not changed.

As a side note, reload just determines whether the remote container is rolled after any file change.

See if that change fixes things for you and we can work from there. If not, you might also try specifying a pod without the selectors and see if that works (as a test).

@timfallmk Regarding -l some=something,somethingelse=somewhere: that indeed does not work; it just picks the first selector specified and ignores the rest. I already tried it with multiple -l flags yesterday, and that didn't work either.

I just tried
ksync create -p jumpy-rodent-mlbench-coordinator-6bc59dd765-5hglw --local-read-only --name coordinator --context=local /home/zenon/DEV/epfl/mlbench/mlbench/coordinator/ /app/code/

ksync create -p jumpy-rodent-mlbench-coordinator-6bc59dd765-5hglw --local-read-only --name coordinator /home/zenon/DEV/epfl/mlbench/mlbench/coordinator/ /app/code/

ksync create -p jumpy-rodent-mlbench-coordinator-6bc59dd765-5hglw --local-read-only --name coordinator --context=local --reload=false /home/zenon/DEV/epfl/mlbench/mlbench/coordinator/ /app/code/

with the same results (not working).

$ kubectl get pods
NAME READY STATUS RESTARTS AGE
jumpy-rodent-mlbench-coordinator-6bc59dd765-5hglw 1/1 Running 0 20h
jumpy-rodent-mlbench-experiment-94ff54498-rjjwt 1/1 Running 0 20h
jumpy-rodent-mlbench-experiment-94ff54498-t8pcw 1/1 Running 0 20h

I checked all 3 containers to see if the files are there, just to make sure it didn't pick the wrong folder. None of them contain my changed file (it's called views.py, and I ran find / -iname "views.py" on all three containers). It only exists on coordinator (the one I'm trying to use ksync with), and that copy still has the original content.
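A quick way to confirm whether the change actually lands is to compare checksums on both sides. A sketch, reusing the pod name and paths from above and assuming md5sum exists in the container image:

```shell
# Local copy:
md5sum /home/zenon/DEV/epfl/mlbench/mlbench/coordinator/main/views.py
# In-pod copy (pod name from `kubectl get pods` above):
kubectl exec jumpy-rodent-mlbench-coordinator-6bc59dd765-5hglw -- \
  md5sum /app/code/main/views.py
# Matching hashes mean the sync landed; differing hashes mean it did not.
```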

commented

If you visit the syncthing web UI (I believe it defaults to 8384), does it show the correct target destinations?

It looks fine, the paths are correct and it lists modifications to the file correctly.

[screenshot: syncthing web UI, 2018-07-10 22:35]
[screenshot: syncthing web UI, 2018-07-10 22:34]

Are there any additional logs I could provide to help solve the issue?

And thank you for the support!

commented

That's very odd. The only thing I can think of, other than the above, is a file permissions issue. Even that should trigger an error though.

@grampelberg ideas?

ksync makes some assumptions about how the FS is laid out under the covers. What FS driver are you using and where is it putting files? I'm wondering if DIND is the issue.

@grampelberg I use https://github.com/kubernetes-sigs/kubeadm-dind-cluster (v1.10), if that helps.

It's all overlay2, stored in /var/lib/docker on the host and /dind/docker in the kubernetes nodes.

On the host:

$ docker info
Containers: 52
Running: 4
Paused: 0
Stopped: 48
Images: 188
Server Version: 18.05.0-ce
Storage Driver: overlay2
Backing Filesystem: extfs
Supports d_type: true
Native Overlay Diff: false
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: bridge host macvlan null overlay
Log: awslogs fluentd gcplogs gelf journald json-file logentries splunk syslog
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Init Binary: docker-init
containerd version: 773c489c9c1b21a6d78b5c538cd395416ec50f88
runc version: 4fc53a81fb7c994640722ac585fa9ca548971871
init version: 949e6fa
Security Options:
seccomp
Profile: default
Kernel Version: 4.17.3-1-ARCH
Operating System: Arch Linux
OSType: linux
Architecture: x86_64
CPUs: 12
Total Memory: 31.36GiB
Name: ZENON-PC
ID: SCNS:IMKI:NVHE:GUYO:VWEH:QUHC:FQFR:N3NJ:AV7Z:WF2A:S2HW:LXQS
Docker Root Dir: /var/lib/docker
Debug Mode (client): false
Debug Mode (server): false
Registry: https://index.docker.io/v1/
Labels:
Experimental: false
Insecure Registries:
localhost:5000
127.0.0.0/8
Live Restore Enabled: false

Inside the kubernetes master:

$ docker exec -it ec7e0 /bin/sh
# docker info
Containers: 19
Running: 11
Paused: 0
Stopped: 8
Images: 5
Server Version: 17.03.2-ce
Storage Driver: overlay2
Backing Filesystem: extfs
Supports d_type: true
Native Overlay Diff: true
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: bridge host macvlan null overlay
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Init Binary: docker-init
containerd version: 4ab9917febca54791c5f071a9d1f404867857fcc
runc version: 54296cf40ad8143b62dbcaa1d90e520a2136ddfe
init version: 949e6fa
Security Options:
seccomp
Profile: default
Kernel Version: 4.17.3-1-ARCH
Operating System: Debian GNU/Linux 9 (stretch) (containerized)
OSType: linux
Architecture: x86_64
CPUs: 12
Total Memory: 31.36 GiB
Name: kube-master
ID: C4TN:JNI7:SJIY:PC6Y:TWNT:GS3L:UW5F:TDBC:EQ5V:TOF6:CMLF:VIMV
Docker Root Dir: /dind/docker
Debug Mode (client): false
Debug Mode (server): false
Registry: https://index.docker.io/v1/
Experimental: false
Insecure Registries:
127.0.0.0/8
Live Restore Enabled: false

WARNING: bridge-nf-call-iptables is disabled
WARNING: bridge-nf-call-ip6tables is disabled

Inside a kubernetes node:

$ docker exec -it 380d08 /bin/sh
# docker info
Containers: 10
Running: 10
Paused: 0
Stopped: 0
Images: 7
Server Version: 17.03.2-ce
Storage Driver: overlay2
Backing Filesystem: extfs
Supports d_type: true
Native Overlay Diff: true
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: bridge host macvlan null overlay
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Init Binary: docker-init
containerd version: 4ab9917febca54791c5f071a9d1f404867857fcc
runc version: 54296cf40ad8143b62dbcaa1d90e520a2136ddfe
init version: 949e6fa
Security Options:
seccomp
Profile: default
Kernel Version: 4.17.3-1-ARCH
Operating System: Debian GNU/Linux 9 (stretch) (containerized)
OSType: linux
Architecture: x86_64
CPUs: 12
Total Memory: 31.36 GiB
Name: kube-node-1
ID: CR2C:LL2W:JZGT:XZZE:AH5Z:5MIP:OYWC:GUOM:7YKE:4BNF:LJLW:J7NC
Docker Root Dir: /dind/docker
Debug Mode (client): false
Debug Mode (server): false
Registry: https://index.docker.io/v1/
Experimental: false
Insecure Registries:
127.0.0.0/8
Live Restore Enabled: false

WARNING: bridge-nf-call-iptables is disabled
WARNING: bridge-nf-call-ip6tables is disabled

Yup, that's the problem. We're only mounting /var/lib/docker (see https://github.com/vapor-ware/ksync/blob/master/pkg/ksync/cluster/daemon_set.go#L78).

The solution to this should be adding a configuration option to install that lets you set that path. @timfallmk want to take a look?

commented

It would probably make more sense to detect what the docker daemon is pointed at, but I'll look into it.
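For what it's worth, the detection could be as simple as parsing the docker daemon's own report of its data root. A minimal sketch, assuming the node exposes the docker CLI and the "Docker Root Dir" field as printed by docker info:

```shell
# Sketch: extract the docker daemon's data root from `docker info` output,
# so the daemonset could mount the right hostPath instead of assuming
# /var/lib/docker. Reads the text on stdin.
parse_docker_root() {
  awk -F': ' '/^ *Docker Root Dir/ {print $2}'
}

# Example against the DIND node output from this thread:
printf 'Storage Driver: overlay2\nDocker Root Dir: /dind/docker\n' | parse_docker_root
# prints: /dind/docker
```

On a live node this would be fed from `docker info` directly (or `docker info --format '{{ .DockerRootDir }}'` on newer CLIs).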

Great, looking forward to it being configurable. Thanks a lot for the quick responses

For now I'll try just getting the DIND cluster to use the regular path; I think changing the /dind/docker path on line 89 of https://github.com/kubernetes-sigs/kubeadm-dind-cluster/blob/5e48a6377148b1e0d1b3d8c4609a140c7b00b57b/image/rundocker should do it. I haven't had a chance to test it yet. I'll report back once I have, in case there are other issues with kubeadm DIND.

commented

@Panaetius It might take some time before I get to it. In the meantime, changing the configured volume mount @grampelberg listed and recompiling should let you use it.

Good call @timfallmk, that's just a daemonset. Just use kubectl to edit the daemonset to the right path and it should (tm) work (hopefully).
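For reference, that in-place edit might look like the following (an ops sketch; the daemonset name "ksync" and the volume name are assumptions, so check the live spec with kubectl -n kube-system get ds first):

```shell
kubectl -n kube-system edit daemonset ksync
# ...then change the hostPath volume that currently points at the docker root:
#   volumes:
#   - name: docker-root          # volume name assumed; use whatever the spec shows
#     hostPath:
#       path: /dind/docker       # was /var/lib/docker
```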

commented

@timfallmk

I just tried again with the newest version and sadly it still doesn't seem to work.

I connected to the syncthing service running on the node and enabled (most) debug log settings. The path is now correct. E.g.:

http: GET "/rest/system/log?since=2018-07-12T09%3A20%3A28.292799137Z": status 200, 716 bytes in 0.09 ms
2018-07-12 09:20:32 read IndexUpdate message
2018-07-12 09:20:32 queueing IndexUpdate(2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2, concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx, 1 files)
2018-07-12 09:20:32 Index update (in): 2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2 / "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx": 1 files
2018-07-12 09:20:32 concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx Update(2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2, [1])
2018-07-12 09:20:32 insert; folder="concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" device=2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2 File{Name:"main/views.py", Sequence:57, Permissions:0644, ModTime:2018-07-12 09:20:31.590656547 +0000 UTC, Version:{[{J7HTQSK 9} {2AMD5RW 6}]}, Length:900, Deleted:false, Invalid:false, NoPermissions:false, BlockSize:131072, Blocks:[Block{0/900/1563634915/51e3f54be12434506e24c60afc79af3f73349cb2e3ed24131c08cffb9fe22e54}]}
2018-07-12 09:20:32 update global; folder="concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" device=2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2 file="main/views.py" version={[{J7HTQSK 9} {2AMD5RW 6}]} invalid=false
2018-07-12 09:20:32 new global after update: {{{[{J7HTQSK 9} {2AMD5RW 6}]}, 2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2}, {{[{J7HTQSK 9} {2AMD5RW 5}]}, 7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4}}
2018-07-12 09:20:32 log 87 RemoteIndexUpdated map[items:1 version:57 device:2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2 folder:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx]
2018-07-12 09:20:32 Device 2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2 sent an index update for "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" with 1 items
2018-07-12 09:20:32 http: GET "/rest/events?since=81": status 200, 282 bytes in 58212.87 ms
2018-07-12 09:20:32 http: GET "/rest/events?since=81": status 200, 282 bytes in 31767.09 ms
2018-07-12 09:20:32 size.go:85 basic /var/syncthing/config/ Usage . {90801340416 203373813760}
2018-07-12 09:20:32 sendReceiveFolder/concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx@0xc4200b9680 pulling (ignoresChanged=false)
2018-07-12 09:20:32 log 88 StateChanged map[to:syncing from:idle duration:58.237464443 folder:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx]
2018-07-12 09:20:32 Folder "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" is now syncing
2018-07-12 09:20:32 sendReceiveFolder/concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx@0xc4200b9680 copiers: 2 pullerPendingKiB: 32768
2018-07-12 09:20:32 concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx WithNeed(7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4)
2018-07-12 09:20:32 need folder="concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" device=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4 name="main/views.py" need=true have=true invalid=false haveV={[{J7HTQSK 9} {2AMD5RW 5}]} globalV={[{J7HTQSK 9} {2AMD5RW 6}]} globalDev=2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2
2018-07-12 09:20:32 stats.DeviceStatisticsReference.WasSeen: 2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2
2018-07-12 09:20:32 open: open /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/main/.syncthing.views.py.tmp: no such file or directory
2018-07-12 09:20:32 log 89 ItemStarted map[type:file action:update folder:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx item:main/views.py]
2018-07-12 09:20:32 Started syncing "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" / "main/views.py" (update file)
2018-07-12 09:20:32 sendReceiveFolder/concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx@0xc4200b9680 need file main/views.py; copy 1, reused 0
2018-07-12 09:20:32 progress emitter: registering concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx main/views.py
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 sharedPullerState concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx main/views.py pullNeeded start -> 1
2018-07-12 09:20:32 model@0xc420096fc0 REQ(out): 2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2: "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" / "main/views.py" o=0 s=900 h=51e3f54be12434506e24c60afc79af3f73349cb2e3ed24131c08cffb9fe22e54 wh=5d332ce3 ft=false
2018-07-12 09:20:32 wrote 148 bytes on the wire (2 bytes length, 4 bytes header, 4 bytes message length, 138 bytes message (132 uncompressed)), err=
2018-07-12 09:20:32 read Response message
2018-07-12 09:20:32 sharedPullerState concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx main/views.py pullNeeded done -> 0
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 http: GET "/rest/events?since=82": status 200, 461 bytes in 0.15 ms
2018-07-12 09:20:32 http: GET "/assets/img/favicon-sync.png": status 304, 0 bytes in 0.05 ms
2018-07-12 09:20:32 sendReceiveFolder/concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx@0xc4200b9680 closing main/views.py
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Sending main/views.py non-remove
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Sending main/views.py non-remove
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Ignoring main/.syncthing.views.py.tmp
2018-07-12 09:20:32 log 90 ItemFinished map[action:update folder:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx item:main/views.py error: type:file]
2018-07-12 09:20:32 progress emitter: deregistering concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx main/views.py
2018-07-12 09:20:32 aggregator/"coordinator-s-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" (coordinator-s-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx): Skipping path we modified: main/views.py
2018-07-12 09:20:32 aggregator/"coordinator-s-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" (coordinator-s-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx): Skipping path we modified: main/views.py
2018-07-12 09:20:32 Finished syncing "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" / "main/views.py" (update file): Success
2018-07-12 09:20:32 http: GET "/rest/events?since=84": status 200, 246 bytes in 45.53 ms
2018-07-12 09:20:32 aggregator/"concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" (concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx): Skipping path we modified: main/views.py
2018-07-12 09:20:32 aggregator/"concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" (concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx): Creating eventDir at: main
2018-07-12 09:20:32 aggregator/"concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" (concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx): Tracking (type non-remove): main/views.py
2018-07-12 09:20:32 aggregator/"concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" (concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx): Resetting notifyTimer to 1s
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Sending main/views.py non-remove
2018-07-12 09:20:32 basic /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/ Watch: Sending main/views.py non-remove
2018-07-12 09:20:32 concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx Update(7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4, [1])
2018-07-12 09:20:32 removing sequence; folder="concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" sequence=39 main/views.py
2018-07-12 09:20:32 adding sequence; folder="concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" sequence=40 main/views.py
2018-07-12 09:20:32 insert; folder="concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" device=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4 File{Name:"main/views.py", Sequence:40, Permissions:0644, ModTime:2018-07-12 09:20:31.590656547 +0000 UTC, Version:{[{J7HTQSK 9} {2AMD5RW 6}]}, Length:900, Deleted:false, Invalid:false, NoPermissions:false, BlockSize:131072, Blocks:[Block{0/900/1563634915/51e3f54be12434506e24c60afc79af3f73349cb2e3ed24131c08cffb9fe22e54}]}
2018-07-12 09:20:32 update global; folder="concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" device=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4 file="main/views.py" version={[{J7HTQSK 9} {2AMD5RW 6}]} invalid=false
2018-07-12 09:20:32 new global after update: {{{[{J7HTQSK 9} {2AMD5RW 6}]}, 7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4}, {{[{J7HTQSK 9} {2AMD5RW 6}]}, 2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2}}
2018-07-12 09:20:32 log 91 LocalIndexUpdated map[version:40 folder:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx items:1 filenames:[main/views.py]]
2018-07-12 09:20:32 log 92 RemoteChangeDetected map[path:main/views.py modifiedBy:2AMD5RW folder:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx folderID:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx label:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx action:modified type:file]
2018-07-12 09:20:32 stats.FolderStatisticsReference.ReceivedFile: concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx main/views.py
2018-07-12 09:20:32 http: GET "/rest/events?since=85": status 200, 234 bytes in 36.62 ms
2018-07-12 09:20:32 concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx WithHaveSequence(40)
2018-07-12 09:20:32 sendReceiveFolder/concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx@0xc4200b9680 changed 1
2018-07-12 09:20:32 sendReceiveFolder/concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx@0xc4200b9680 copiers: 2 pullerPendingKiB: 32768
2018-07-12 09:20:32 Remote change detected in folder "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx": modified file main/views.py
2018-07-12 09:20:32 Sending indexes for concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx to 2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2 at 127.0.0.1:22000-127.0.0.1:36182/tcp-server: 1 files (<127 bytes) (last batch)
2018-07-12 09:20:32 concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx WithNeed(7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4)
2018-07-12 09:20:32 wrote 211 bytes on the wire (2 bytes length, 4 bytes header, 4 bytes message length, 201 bytes message (201 uncompressed)), err=
2018-07-12 09:20:32 sendReceiveFolder/concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx@0xc4200b9680 changed 0
2018-07-12 09:20:32 log 93 StateChanged map[folder:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx to:idle from:syncing duration:0.103977409]
2018-07-12 09:20:32 Folder "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" is now idle
2018-07-12 09:20:32 concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx WithNeedTruncated(7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4)
2018-07-12 09:20:32 progress emitter: bytes completed for concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx: 0
2018-07-12 09:20:32 model@0xc420096fc0 NeedSize("concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx"): {0 0 0 0 0 0 []}
2018-07-12 09:20:32 log 94 FolderSummary map[summary:map[localDirectories:5 localDeleted:0 localBytes:143268 needDirectories:0 needDeletes:0 needBytes:0 invalid: globalSymlinks:0 globalDeleted:0 needFiles:0 inSyncBytes:143268 stateChanged:2018-07-12 09:20:32.701780007 +0000 UTC m=+202.825836477 version:97 pullErrors:0 globalDirectories:5 needSymlinks:0 localFiles:21 localSymlinks:0 inSyncFiles:21 state:idle sequence:97 ignorePatterns:false globalFiles:21 globalBytes:143268] folder:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx]
2018-07-12 09:20:32 stats.DeviceStatisticsReference.WasSeen: 2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2
2018-07-12 09:20:32 Summary for folder "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" is map[localDirectories:5 inSyncBytes:143268 globalFiles:21 globalSymlinks:0 globalDirectories:5 localDeleted:0 pullErrors:0 needDirectories:0 globalDeleted:0 needFiles:0 localSymlinks:0 inSyncFiles:21 sequence:97 localBytes:143268 needDeletes:0 needBytes:0 version:97 localFiles:21 needSymlinks:0 globalBytes:143268 state:idle]
2018-07-12 09:20:32 concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx WithNeedTruncated(2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2)
2018-07-12 09:20:32 model@0xc420096fc0 Completion(2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2, "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx"): 100.000000 (0 / 143268 = 0.000000)
2018-07-12 09:20:32 log 95 FolderCompletion map[device:2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2 completion:100 needBytes:0 needItems:0 globalBytes:143268 needDeletes:0 folder:concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx]
2018-07-12 09:20:32 Completion for folder "concrete-termite-eponymous-turtle-mlbench-coordinator-555447fdc7-q2ncx" on device 2AMD5RW-GVNA4P3-CGBIJH2-G6OECKY-73OEILD-LDEN5TX-PDJS7AQ-2DBL7Q2 is 100%

We can see it updating the file /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/main/views.py, so the new docker root of "/dind/docker" is being picked up correctly.
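A quick sanity check for this situation is to hash the local file and the copy at the overlay path syncthing reports and compare them. A minimal sketch (the helper name is mine; the commented-out paths are the ones from this issue and would be run on the node):

```python
import hashlib
from pathlib import Path

def same_content(local, remote):
    """True iff both files exist and their SHA-256 digests match."""
    lp, rp = Path(local), Path(remote)
    if not (lp.is_file() and rp.is_file()):
        return False
    return (hashlib.sha256(lp.read_bytes()).hexdigest()
            == hashlib.sha256(rp.read_bytes()).hexdigest())

# Hypothetical usage with the paths from this issue:
# same_content(
#     "/home/zenon/DEV/epfl/mlbench/mlbench/coordinator/main/views.py",
#     "/dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/main/views.py")
```

If this returns False right after a sync reports success, the write went somewhere other than the path you are inspecting.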

But if I connect to the kube node the pod is running on, the path exists, yet the file is still unchanged (last modified yesterday; the content is unchanged as well):

ls -alh /dind/docker/overlay2/2c73c728b130d6eb4d923838a400ad6b51df9947946a0a7cd4c0e882efcd05c3/merged/app/code/main/
total 48K
drwxr-xr-x 1 root root 4.0K Jul 11 10:45 .
drwxr-xr-x 1 root root 4.0K Jul 11 20:11 ..
-rw-r--r-- 1 root root 0 Jul 6 12:02 __init__.py
drwxr-xr-x 1 root root 4.0K Jul 12 08:37 __pycache__
-rw-r--r-- 1 root root 63 Jul 6 12:02 admin.py
-rw-r--r-- 1 root root 83 Jul 6 12:02 apps.py
drwxr-xr-x 2 root root 4.0K Jul 6 12:02 migrations
-rw-r--r-- 1 root root 57 Jul 6 12:02 models.py
-rw-r--r-- 1 root root 60 Jul 6 12:02 tests.py
-rw-r--r-- 1 root root 169 Jul 11 10:45 urls.py
-rwxrwxrwx 1 root root 795 Jul 11 20:11 views.py

I can edit the file manually in the shell on the node, which does change the application code, i.e. the change persists on the node.

commented

This has to be something with how they're handling volume mounts. What happens if you roll the container after making a change (and seeing that change go through in the logs)?

What happens if you roll the container

Do you mean rollback? Or what do you mean by "roll"?

I can try it first thing tomorrow morning.

commented

Sorry, "restart". It's possible the filesystem changes are not being picked up after the file is changed on disk. Other than that, I'm kinda out of ideas.

Nothing in particular happens when I restart the container.

But I did some more digging and think I figured it out (well, the problem, not a solution).

I changed the code file locally (adding test_source), waited for the change to propagate, then connected to a shell inside the container and changed the same file with a different string (Test2). Then I ran grep for both strings under the /var/lib/docker/ folder of my host system.

Results:

[zenon@ZENON-PC docker]$ cd /var/lib/docker/
[zenon@ZENON-PC docker]$ sudo grep -r --include views.py 'Welcome!test_source!'
overlay2/18e66711fe3374c4a29dcc1564692ae28bfb47e4daed4288714a39b350fab71b/diff/var/lib/docker/overlay2/e700956c8a5208553f4c03555f240cf992910f684bc063ac65333e6c3dbf48e6/merged/app/code/main/views.py:    return HttpResponse("Welcome!test_source!")
overlay2/18e66711fe3374c4a29dcc1564692ae28bfb47e4daed4288714a39b350fab71b/merged/var/lib/docker/overlay2/e700956c8a5208553f4c03555f240cf992910f684bc063ac65333e6c3dbf48e6/merged/app/code/main/views.py:    return HttpResponse("Welcome!test_source!")
[zenon@ZENON-PC docker]$ sudo grep -r --include views.py 'Welcome!Test2'
volumes/kubeadm-dind-kube-node-1/_data/docker/overlay2/e700956c8a5208553f4c03555f240cf992910f684bc063ac65333e6c3dbf48e6/diff/app/code/main/views.py:    return HttpResponse("Welcome!Test2")
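The same experiment can be scripted. A sketch (the `find_marker` helper is mine; it answers the same question as `sudo grep -r --include views.py '<marker>' /var/lib/docker/`):

```python
from pathlib import Path

def find_marker(root, filename, marker):
    """Return every file named `filename` under `root` whose text contains
    `marker` -- the same question the grep commands above answer."""
    hits = []
    for p in Path(root).rglob(filename):
        try:
            if p.is_file() and marker in p.read_text(errors="ignore"):
                hits.append(str(p))
        except OSError:
            pass  # unreadable entries under /var/lib/docker are expected without root
    return sorted(hits)

# e.g. find_marker("/var/lib/docker", "views.py", "Welcome!test_source!")
```

Running it once per marker shows immediately which graph root each write landed under.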

So syncthing writes to the overlay2 entry nested inside the host's own /var/lib/docker/overlay2 folder, whereas editing the file directly inside the container writes to the overlay2 folder under the kubeadm-dind-kube-node-1 volume.
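That split is consistent with how overlayfs layering works: an entry in the upper layer (docker's `diff` directory) shadows the one in the lower layer, so two daemons rooted at different graph paths accumulate diverging upper layers for the "same" file. A toy lookup simulating that shadowing (not docker's actual code, just the concept):

```python
from pathlib import Path

def overlay_read(name, upperdir, lowerdir):
    """Toy overlayfs lookup: an entry in the upper layer ('diff' in docker's
    overlay2 layout) shadows the same-named entry in the lower layer."""
    for layer in (upperdir, lowerdir):
        p = Path(layer) / name
        if p.is_file():
            return p.read_text()
    raise FileNotFoundError(name)
```

In this model, a write that lands in the wrong daemon's upper layer is simply invisible to readers going through the other daemon's merged view.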

Hope this helps solve the problem.

commented

In that case it seems like something else is using the default graph driver path. I'll root around and see if I can find it, unless @grampelberg knows already.

Hello. I'm having this exact issue. Local files trigger "reloads" but nothing changes in the remote. And if I modify a file in the remote, that does not trigger anything and I don't get the file locally.

I have a default installation of Kubernetes and my "poc" is using nginx with a simple HTML website.