relferreira / metabase-sparksql-databricks-driver

Databricks custom metastore not found

LucienJm opened this issue · comments

Hello,

I have a Databricks SQL Serverless warehouse running with a custom Unity Catalog metastore named "my_metastore".
I have a catalog named "my_catalog" containing a schema named "my_schema", where my tables live.

Using this driver, I tried to connect Metabase to Databricks to access the tables in "my_schema".

My JDBC Spark URL looks like this:

jdbc:spark://xxx.cloud.databricks.com:xxx/my_metastore.my_catalog.my_schema;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/xxx;UID=xxx;PWD=xxx

I've tried the following patterns to get data from my schema:
jdbc:spark://xxx.cloud.databricks.com:xxx/my_metastore.my_catalog.my_schema;...
jdbc:spark://xxx.cloud.databricks.com:xxx/my_catalog.my_schema;...
jdbc:spark://xxx.cloud.databricks.com:xxx/my_schema;...

Every time, I get the same exception in the logs:
Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchCatalogException: Catalog 'my_metastore' not found

When I don't specify a metastore, catalog, or schema, it falls back to "hive_metastore" (the default one) and works.
But I want to query data from "my_metastore", not "hive_metastore".

Do you have any idea why, please?

Thank you very much

Hello,

I managed to fix it by adding

  • ConnCatalog=my_catalog
  • ConnSchema=my_schema

in the JDBC string.
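
For anyone hitting the same error, the full working URL would presumably look something like this (the xxx placeholders stand for the host, port, warehouse path, and credentials exactly as in the original string; the path segment shown here as "default" is an assumption, since the exact value used in the fix wasn't stated):

jdbc:spark://xxx.cloud.databricks.com:xxx/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/xxx;UID=xxx;PWD=xxx;ConnCatalog=my_catalog;ConnSchema=my_schema

The key point is that the Simba Spark driver does not parse the three-level "metastore.catalog.schema" notation in the URL path; the Unity Catalog catalog and schema have to be selected through the ConnCatalog and ConnSchema properties instead.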