ktomk / pipelines

Pipelines - Run Bitbucket Pipelines Wherever They Dock

Home Page: https://ktomk.github.io/pipelines/


Pipelines Fatal Environment Definition Error is puzzling

acederberg opened this issue

Hi @ktomk! Sorry if I have raised too many issues. Hopefully this is not something caused by my configuration, as it runs fine in Bitbucket. I run pipelines in a repository with the following Bitbucket configuration:

image: python:3.10

options :
  docker : true

definitions :
  services :
    mysql :
      variables :
        MYSQL_RANDOM_ROOT_PASSWORD : 'yes'
        MYSQL_DATABASE : 'mve_brain_sqlalchemy_tests'
        MYSQL_USER : 'adrian'
        MYSQL_PASSWORD : 'somepassword'
      image : mysql
  steps :
    - step : &create_env_test
        name : Create .env.test for testing since this file should never be committed.
        script :
          - export ENV_TEST_PATH=".env.test"
          - touch $ENV_TEST_PATH
          - echo BRAIN_MYSQL='{"database":"mve_brain_sqlalchemy_tests","host":"localhost","port":3306,"username":"adrian","password":"somepassword","drivername":"mysql+asyncmy"}' >> $ENV_TEST_PATH 
          - echo BRAIN_AUTH0='{"client_id":"$BRAIN_AUTH0__CLIENT_ID","client_secret":"$BRAIN_AUTH0__CLIENT_SECRET","secret_key":"$BRAIN_AUTH0__SECRET_KEY","audience":"http://localhost:8000/","issuer":"$BRAIN_AUTH0__ISSUER"}' >> $ENV_TEST_PATH
          - echo BRAIN_UVICORN='{"host":"0.0.0.0","port":8000,"reload":true}' >> $ENV_TEST_PATH
          - echo BRAIN_AUTHDUMMY='{"secret_key":"$BRAIN_AUTH_DUMMY_SECRET_KEY","client_secret":"$BRAIN_AUTH_DUMMY__CLIENT_SECRET","admin_secret":"$BRAIN_AUTH_DUMMY__ADMIN_SECRET","token_timeout":3600}' >> $ENV_TEST_PATH
          - echo BRAIN_API_INCLUDE_UNSAFE_ROUTES=1 >> $ENV_TEST_PATH
        artifacts :
          - .env.test

    - step : &invoke_tests
        name : Run tests for non-fetchers items.
        caches :
          - pip
        script :
          - pip install -r requirements.txt 
          - pip install -r requirements.dev.txt
          - BRAIN_API_PURPOSE=test python -m pytest
        services :
          - mysql
 
    - step : &build_containers
        name : Building prod and fetchers Docker images (without environment files)
        caches :
          - pip
        script :
          - export ACR_URI="$ACR_NAME.azurecr.io"
          - export IMAGE_API="$ACR_URI/api:$BITBUCKET_COMMIT" 
          - export IMAGE_FETCHERS="$ACR_URI/fetchers:$BITBUCKET_COMMIT"
          - docker login $ACR_URI --username $BBSP_USERNAME --password $BBSP_PASSWORD

          - docker build -t "$IMAGE_API" -f "Dockerfile.prod" --target "prod" .
          - docker build -t "$IMAGE_FETCHERS" -f "Dockerfile.prod" --target "fetcher_runner" . 
          
          - docker run --name test1 --detach "$IMAGE_API" 
          - sleep 30 
          - docker stop test1
                    
          - docker push $IMAGE_API 
          - docker push $IMAGE_FETCHERS

    - step : &typecheck_source
        name : See if code passes mypy's type checking.
        caches :
          - pip
        script :
          - python -m pip install mypy sqlalchemy[mypy]
          - if ( python -m mypy . > results_mypy 2>&1 ); then echo 1; else echo 0; fi
        artifacts :
          - results_mypy

    - step : &lint_source
        name : See if code passes flake8. Try to autolint.
        caches :
          - pip 
        script :
          - pip install flake8 
          - if ( python -m flake8 . > results_flake8 2>&1 ); then echo 1; else echo 0; fi
        artifacts :
          - results_flake8

    - step  : &scan_source
        name : Check for secrets etc.
        caches : 
          - pip
        script :
          - pip install bandit
          - if ( bandit . > results_bandit 2>&1 ); then echo 1; else echo 0; fi
        artifacts :
          - results_bandit

  basic : &basic
    - step : 
        <<: *create_env_test
    - step :
        <<: *invoke_tests

  everything : &everything

    - parallel :
      - step : 
          <<: *scan_source
      - step :
          <<: *lint_source
      - step :
          <<: *typecheck_source
    - step : 
        <<: *create_env_test
    - step :
        <<: *invoke_tests

  everythingandbuild  : &everythingandbuild

    - parallel :
      - step : 
          <<: *scan_source
      - step :
          <<: *lint_source
      - step :
          <<: *typecheck_source
    - step : 
        <<: *create_env_test
    - step :
        <<: *invoke_tests
    - step :
        <<: *build_containers

pipelines:
  default : *basic
  branches :
    master : *everythingandbuild
    refactoring :  *everythingandbuild

  pull-requests : 
    '**' : *everything

But I get the following error when I run 'pipelines':

pipelines: fatal: Variable definition error: '  "database" : "mve_brain_sqlalchemy_tests",'

I believe this to be a YAML parsing error due to the dictionary/JSON-like structure. I pushed my changes to Bitbucket and everything worked correctly. I am using pipelines 0.0.35 (EDIT: THIS IS WRONG, I AM USING 0.65) and installed it onto WSL2 Ubuntu using Composer version 2.3.4.
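
For reference, the install was Composer's global route, i.e. presumably something like this (the package name matches the vendor paths in the backtrace further down):

composer global require ktomk/pipelines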

Thanks again for your help @ktomk! This project is very helpful to my productivity.

I am using pipelines 0.0.35

Please try with 0.0.65 @acederberg

@ktomk
That was a typo on my part while using the ten-key pad on my keyboard. I am using 0.65.

@acederberg Was the command exactly pipelines, or something else? And can you share the PHP version you have in WSL? (The output of php --version should suffice.) Right now I can't reproduce.

@acederberg: Can you provoke the error while having --debug in the pipelines command line? That should give a backtrace.

I believe this to be a yaml parsing error, due to the dictionary/json like structure.

I'm not so sure.

pipelines: fatal: Variable definition error: ' "database" : "mve_brain_sqlalchemy_tests",'

is related to command-line argument parsing for --env-file (variable definitions inside an env-file) and --env, -e (single variable definitions).

Perhaps a .env or .env.dist file is automatically loaded which you (naturally) don't share with the report.

Therefore, please also try with --no-dot-env-files to prevent automatically loading such files and see how it goes.
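
For example (a sketch, run from the directory with the bitbucket-pipelines.yml; my.env is a made-up file name):

# skip the automatic .env / .env.dist loading
pipelines --no-dot-env-files

# explicit definitions still work via --env/-e and --env-file
pipelines --no-dot-env-files -e FOO=bar --env-file my.env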

Next version will show that it is an Environment Variable Error and, if caused by an --env-file parameter (either explicit or implicit), the path of the file and the line number on which it occurred.

This is in the test branch right now.

I'm pretty sure it is a .env file that is causing this; at least I could reproduce the error message with a broken .env file.
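
For illustration, a .env file carrying a multiline JSON value would do it: the env-file parser reads line by line, so the second line here is not a NAME=value definition (hypothetical content, modeled on the line quoted in the error):

# .env -- python-dotenv style, breaks a docker-style line-by-line parse
BRAIN_MYSQL='{
  "database" : "mve_brain_sqlalchemy_tests",
  "host" : "localhost"
}'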

Hi @ktomk. Sorry for the late response. Let me try to answer your questions:

My PHP version in WSL is 7.4.3.

Running the pipelines --debug command produces the following:

pipelines: fatal: Variable definition error: '  "database" : "mve_brain_sqlalchemy_tests",'
pipelines: version 0.0.65-composer w/ php 7.4.3 (libyaml: n/a)
--------
class....: InvalidArgumentException
message..: Variable definition error: ' "database" : "mve_brain_sqlalchemy_tests",'
code.....: 0
file.....: /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Runner/EnvResolver.php
line.....: 156
backtrace:
#0 /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Runner/EnvResolver.php(137): Ktomk\Pipelines\Runner\EnvResolver->addDefinition()
#1 /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Runner/EnvResolver.php(103): Ktomk\Pipelines\Runner\EnvResolver->addLines()
#2 /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Runner/EnvResolver.php(121): Ktomk\Pipelines\Runner\EnvResolver->addFile()
#3 /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Runner/Env.php(329): Ktomk\Pipelines\Runner\EnvResolver->addFileIfExists()
#4 /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Utility/EnvParser.php(68): Ktomk\Pipelines\Runner\Env->collectFiles()
#5 /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Utility/App.php(146): Ktomk\Pipelines\Utility\EnvParser->parse()
#6 /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Utility/ExceptionHandler.php(50): Ktomk\Pipelines\Utility\App->run()
#7 /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Utility/ExceptionHandler.php(65): Ktomk\Pipelines\Utility\ExceptionHandler->handle()
#8 /home/adr1an/.config/composer/vendor/ktomk/pipelines/src/Utility/App.php(90): Ktomk\Pipelines\Utility\ExceptionHandler->handleStatus()
#9 /home/adr1an/.config/composer/vendor/ktomk/pipelines/bin/pipelines(37): Ktomk\Pipelines\Utility\App->main()
#10 /home/adr1an/.config/composer/vendor/bin/pipelines(112): include('/home/adr1an/.c...')
#11 {main}
--------

I believe it is entirely possible that what you describe is occurring; I have a number of .env files in the repository that define env variables as JSON for pydantic.

Thank you again @ktomk,
This tool is awesome,

Adrian

Can you elaborate a bit on that format of pydantic?

One benefit we could have here is to allow more formats than Docker itself does and make this automatically compatible.

Should these .env files be automatically sourced in the pipeline environment in the first place? @acederberg

/edit: and yes the backtrace shows it is exactly that problem.

@ktomk I think the .env files should be sourced if that is what Bitbucket does; I did not consider it.

Pydantic will allow you to define nested settings, e.g.

from pydantic import BaseModel, BaseSettings


class MyField(BaseModel):

  myFirstField: str
  mySecondField: int


class Settings(BaseSettings):
  """Settings parser.

  The nested field will be specified as JSON.
  """

  someThing: str
  someThingElse: int
  someThingNested: MyField

  class Config:
    env_file = '.env'

This would allow you to write the following configuration file:

# .env
someThing="This will be parsed as a string"
someThingElse=1
someThingNested='{
  "myFirstField" : "This will also be parsed as a string",
  "mySecondField" : 2
}'

I am in a rush and did not test this, but that explains it roughly. Read more in the pydantic settings documentation.
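
Loading it would then look roughly like this (an untested sketch with the hypothetical names from above; pydantic parses the nested value as JSON):

# reads .env via the Config above and JSON-decodes the nested field
settings = Settings()
print(settings.someThingNested.mySecondField)  # -> 2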

I think the .env files should be sourced if that is what bitbucket does, I did not consider it.

Okay, then not, as Bitbucket does not source it; it's a convention of the pipelines utility. Use the --no-dot-env-files command-line switch to avoid that error, as it skips loading these files.

Thanks for the info about the format in your .env file. It wouldn't make sense to include it nevertheless, but it's good to know about. The Python-dotenv format is incompatible with Docker's format (to which pipelines adheres) for multiline values (1/3), variables without a value (2/3) and variable expansion (3/3). Using it with Docker would be especially breaking for the second, a variable without a value. On first review I'd say it's not possible here to make it compatible (and also it would be wrong to pass it into the pipeline containers' environment; the utility running inside reads it).
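
To illustrate the clash (a sketch; the variable names are made up):

# docker env-file format (what pipelines follows): one NAME=value per line,
# values taken literally, no multiline values
FOO=bar
# a bare name passes that variable through from the host environment
HOME

# python-dotenv additionally allows quoting, ${VAR} expansion and multiline
# values, none of which survive a docker-style line-by-line parse:
NESTED='{
  "key": "value"
}'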

So this is a bit of a clash (user-solvable by --no-dot-env-files) between pipelines and a project setup. The best idea I have is to not auto-take that file if it errors. On the other hand, if it was intended to be included and it errors, it would be wrong to skip it. Looks like there is no straightforward answer here.

If you have any suggestions, let me know; also if the switch is not of use for you.

Version 0.0.67 is out and it contains better error messages for the .env files (showing the filename).

It is my understanding that with the better error message this is less puzzling.

I'll close for now, thanks for reporting this one!