werf / actions

Set of actions for implementing CI/CD with werf and GitHub Actions

Home Page: https://werf.io/


Get ConfigMap werf-synchronization error

Serg046 opened this issue · comments

I am new to werf and its integration with GH. First, I just created a simple Dockerfile, start.sh, and werf.yaml as described in the werf docs. Then I created a GH workflow YAML according to the docs as well. The image seems to build successfully, but then I get:

Error: unable to create lock manager: get ConfigMap werf-synchronization error: Get "https://127.0.0.1:6443/api/v1/namespaces/werf-guide-app-production/configmaps/werf-synchronization": dial tcp 127.0.0.1:6443: connect: connection refused
Error: The process '/home/runner/work/_temp/werf-tools/werf' failed with exit code 1

Since I am using everything from the official docs, I think the only place I can have a mistake is KUBE_CONFIG_BASE64_DATA GH secret. I've created it according to https://kubesandclouds.com/index.php/2020/09/01/werf-gitops/:

  1. kubectl create secret docker-registry regcred --docker-server=ghcr.io --docker-username=serg046 --docker-password=generate_token_here
  2. kubectl patch serviceaccount default -p '{"imagePullSecrets": [{"name": "regcred"}]}'
  3. kubectl config view --raw | base64
  4. Finally I put the result to the secret on GH
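Steps 3 and 4 above have a common pitfall: on many systems `base64` wraps its output into multiple lines, which corrupts the value when pasted into a GitHub secret. The sketch below (my assumption, not from the werf docs; it uses GNU `base64 -w0` and a stand-in `kubeconfig.yaml` with a placeholder server address) shows one way to produce a single-line value and sanity-check it before uploading:

```shell
# Stand-in kubeconfig for illustration -- replace with your real file,
# e.g. exported via `kubectl config view --raw > kubeconfig.yaml`
# on a machine that has access to the cluster.
cat > kubeconfig.yaml <<'EOF'
apiVersion: v1
kind: Config
clusters:
- name: prod
  cluster:
    server: https://203.0.113.10:6443
EOF

# Sanity check: the server must be reachable from GitHub's hosted runners,
# i.e. NOT 127.0.0.1 / localhost (those only resolve on the cluster node).
grep 'server:' kubeconfig.yaml

# Encode as a single line; -w0 (GNU coreutils) disables line wrapping.
base64 -w0 kubeconfig.yaml > kubeconfig.b64

# Round-trip check: decoding should reproduce the original file exactly.
base64 -d kubeconfig.b64 | diff - kubeconfig.yaml && echo "round-trip ok"
```

The content of `kubeconfig.b64` is then what goes into the `KUBE_CONFIG_BASE64_DATA` secret.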

Could you please help me make progress?
Failing job: https://github.com/Serg046/serg046.github.io/runs/5267035872?check_suite_focus=true

Looks like you are using the wrong config, because werf tries to communicate with a k8s cluster installed locally (dial tcp 127.0.0.1:6443), whereas you are not using self-hosted GitHub runners.
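One quick way to confirm this diagnosis is to decode the stored secret value and look at the `server:` field. A minimal sketch (the `secret.b64` file and its contents are made-up stand-ins for your actual secret value):

```shell
# Stand-in for the value stored in the KUBE_CONFIG_BASE64_DATA secret;
# it decodes to "server: https://127.0.0.1:6443".
printf 'c2VydmVyOiBodHRwczovLzEyNy4wLjAuMTo2NDQz' > secret.b64

# Decode and inspect which API server the kubeconfig targets.
base64 -d secret.b64
# A 127.0.0.1 / localhost address here means the config came from a local
# cluster (minikube, k3s on the node, etc.) and cannot work on hosted runners.
```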

That was my first thought actually, but it didn't help me understand the issue, since I use the files from the docs: Dockerfile, werf.yaml, prod.yml. So the files are really simple. The only customized thing is KUBE_CONFIG_BASE64_DATA, but I am not sure whether it could cause such an issue...

I meant kubeconfig; the rest of the configuration is correct. It looks like you are using the configuration for a local minikube or something similar.

Yes, the config was wrong, thank you.