PrefectHQ / prefect-dbt

Collection of Prefect integrations for working with dbt with your Prefect flows.

Home Page: https://prefecthq.github.io/prefect-dbt/


Add the capability to use the Pod service account instead of a GCP credentials file

lucienfregosibodyguard opened this issue · comments

Hi again,

It would be awesome if, instead of reading the GCP JSON credentials file, it were possible to use the service account associated with the Kubernetes pod.

As with the GCP Python clients, we wouldn't need to write or specify any JSON file.

I can help on this point if you need it.

Thanks

Thanks for the suggestion! We would appreciate any help implementing, or simply pointers on how to do so!

I guess if it's possible to not set any GCP identity through the Python code, it will pick up the current GCP identity provided by the pod's service account, like other GCP clients do (BigQuery, for example)!

Thanks for the suggestion. Do you have code snippets by any chance?

from google.cloud import bigquery

# No explicit key file: the client falls back to Application Default Credentials.
client = bigquery.Client()

query = """
    SELECT name, SUM(number) as total_people
    FROM `bigquery-public-data.usa_names.usa_1910_2013`
    WHERE state = 'TX'
    GROUP BY name, state
    ORDER BY total_people DESC
    LIMIT 20
"""
query_job = client.query(query)  # Make an API request.

print("The query data:")
for row in query_job:
    # Row values can be accessed by field name or index.
    print("name={}, count={}".format(row[0], row["total_people"]))

Here, for example, we don't use the GCP credentials file; we use the current GCP identity loaded where the script runs (and it would be the same within a Kubernetes pod).

If I'm not mistaken, I think you can initialize GcpCredentials without service_account_file or service_account_info, e.g. credentials=GcpCredentials(), and it would infer from the local env. Let me know if that works.
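For reference, a minimal sketch of the fallback behavior being suggested, assuming the usual Application Default Credentials (ADC) resolution order; `resolve_credentials` is a hypothetical helper for illustration, not part of prefect-gcp:

```python
import os


def resolve_credentials(service_account_file=None, service_account_info=None):
    """Hypothetical sketch: prefer explicit credentials, else fall back to ADC.

    Mirrors how GCP clients resolve identity when no key is passed:
    GOOGLE_APPLICATION_CREDENTIALS, then gcloud user config, then the
    VM/pod metadata server's service account.
    """
    if service_account_file:
        return ("file", service_account_file)
    if service_account_info:
        return ("info", service_account_info)
    # No explicit credentials supplied: defer to the ambient identity,
    # which inside a pod would be the pod's service account.
    return ("adc", os.environ.get("GOOGLE_APPLICATION_CREDENTIALS"))
```

So `GcpCredentials()` with no arguments would take the third branch and rely on whatever identity the environment provides.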

I guess I was mistaken; the output profile wouldn't contain that info without credentials specified. The PR above should fix it.