sahava / multisite-lighthouse-gcp

Run Lighthouse audits on URLs, and write the results daily into a BigQuery table.


Write integration tests

sahava opened this issue · comments

Write an integration test that verifies the payload written to BQ is valid NDJSON and conforms to the BigQuery table schema.
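As a starting point, such a test could be sketched roughly as below. This is a minimal sketch, not the project's actual test suite: the `validateNdjson` helper and the row fields (`url`, `fetch_time`, `performance`) are hypothetical stand-ins for whatever the real BQ schema defines.

```javascript
// Sketch: validate that a payload is well-formed NDJSON and that every row
// carries the fields the (hypothetical) BigQuery schema requires.
function validateNdjson(payload, requiredFields) {
  const lines = payload.split('\n').filter(l => l.trim().length > 0);
  return lines.every(line => {
    let row;
    try {
      row = JSON.parse(line); // each line must be a standalone JSON object
    } catch (e) {
      return false; // not valid JSON -> not valid NDJSON
    }
    if (typeof row !== 'object' || row === null || Array.isArray(row)) {
      return false; // NDJSON rows for BQ must be objects
    }
    // every required schema field must be present on every row
    return requiredFields.every(f => f in row);
  });
}

// Example usage with hypothetical fields resembling a Lighthouse result row:
const payload = [
  JSON.stringify({ url: 'https://example.com/', fetch_time: '2020-01-01T00:00:00Z', performance: 0.9 }),
  JSON.stringify({ url: 'https://example.com/about', fetch_time: '2020-01-01T00:00:05Z', performance: 0.8 })
].join('\n');

console.log(validateNdjson(payload, ['url', 'fetch_time', 'performance'])); // true
```

A fuller integration test would additionally run a dry-run load job against BigQuery (or a schema validator) so type mismatches are caught, not just missing fields.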

In addition: if too many sites are in the config, the BQ load jobs fail because too many concurrent jobs are started in a short time period.

--> If the table is set up to be partitioned, this increases the limit by a large margin, but does not completely prevent the issue.

Instead, I suspect that delaying (sleeping) even one second between each process starting would likely solve the issue.
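The staggering idea could be sketched like this. It is only an illustration of the approach, not the project's code: `launchAudit` here is a hypothetical function standing in for whatever starts one site's audit-and-load process, and the one-second delay is the value suggested above.

```javascript
// Sketch: start jobs one second apart so BQ load jobs are not all
// created in the same instant, then wait for all of them to finish.
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

async function launchStaggered(urls, launchAudit, delayMs = 1000) {
  const pending = [];
  for (const url of urls) {
    pending.push(launchAudit(url)); // kick off the job without awaiting it
    await sleep(delayMs);           // stagger the next start by delayMs
  }
  return Promise.all(pending);      // resolve once every job completes
}
```

Note that the jobs still run concurrently; only their start times are spread out, which is what matters for BigQuery's per-interval job creation limits.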