spirom / spark-mongodb-connector

A prototype native MongoDB connector for Apache Spark, using Spark's external datasource API


All configuration must be available via OPTIONS in Spark SQL's CREATE TEMPORARY TABLE

spirom opened this issue · comments

Much of NSMC's configuration uses SparkContext properties, but these don't work over, say, a JDBC connector where the context is not easily accessible, and in any case they make it hard to use a diverse set of MongoDB instances, or even collections.

Spark SQL integration needs to be extended so that all of this configuration can be passed through the OPTIONS clause of CREATE TEMPORARY TABLE, etc., on a per-collection basis.
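A rough sketch of what per-collection configuration in the OPTIONS clause might look like once this is supported. The provider class follows NSMC's Spark SQL integration; the individual option keys shown here (host, port, db, collection) are assumptions for illustration, not a final API:

```sql
-- Each temporary table carries its own connection settings,
-- so different tables can point at different MongoDB
-- instances and collections (option keys are hypothetical).
CREATE TEMPORARY TABLE mongoTable
USING nsmc.sql.MongoRelationProvider
OPTIONS (
  host 'localhost',          -- MongoDB host (assumed key)
  port '27017',              -- MongoDB port (assumed key)
  db 'mydb',                 -- database name (assumed key)
  collection 'mycollection'  -- collection name (assumed key)
)
```

With everything in OPTIONS, no SparkContext properties are needed, which would make the connector usable from a JDBC/Thrift server where the context is not directly accessible.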