The official MongoDB Spark Connector.
See: https://docs.mongodb.com/spark-connector/
The connector is published on Spark Packages, the community index of third-party packages for Apache Spark. The binaries and dependency information for Maven, SBT, Ivy, and others can also be found on Maven Central.
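For example, a minimal sbt dependency on the connector might look like the sketch below. The coordinates come from Maven Central; the version shown (2.2.0, matching Spark 2.2.x) is only an illustrative assumption, so substitute the release that matches your Spark and Scala versions.

// build.sbt (illustrative sketch): fetch the connector from Maven Central.
// The version is an assumption; pick the release matching your Spark version.
libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "2.2.0"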
For issues with, questions about, or feedback for the MongoDB Spark Connector, please look into our support channels. Please do not email any of the developers directly with issues or questions - you're more likely to get an answer on the mongodb-user discussion forum.
At a minimum, please include in your description the exact version of the driver that you are using. If you are having connectivity issues, it's often also useful to paste in the Spark configuration. You should also check your application logs for any connectivity-related exceptions and post those as well.
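For reference, the relevant Spark configuration is usually the spark.mongodb.* settings supplied to the SparkSession (or via --conf on spark-submit). The sketch below shows one common way to set them in Scala for the 2.x connector; the URI, database, and collection names are placeholders, not values from this project.

import org.apache.spark.sql.SparkSession

// Minimal sketch: the spark.mongodb.input/output.uri settings are what a
// support request usually needs to show. The URI below is a placeholder.
val spark = SparkSession.builder()
  .appName("mongo-spark-example")
  .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.myCollection")
  .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.myCollection")
  .getOrCreate()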
Think you’ve found a bug? Want to see a new feature in the MongoDB Spark Connector? Please open a case in our issue management tool, JIRA:
- Create an account and login.
- Navigate to the SPARK project.
- Click Create Issue and provide as much information as possible about the issue type and how to reproduce it.
Bug reports in JIRA for the driver and the Core Server (i.e. SERVER) project are public.
If you’ve identified a security vulnerability in a driver or any other MongoDB project, please report it according to the instructions here.
The MongoDB Spark Connector does not follow semantic versioning; instead, its version number tracks the version of Spark it supports.
For example:
- Mongo Spark Connector 2.1.x supports Spark 2.1.x
- Mongo Spark Connector 2.2.x supports Spark 2.2.x
Major changes may occur between point releases, such as new APIs or an upgrade of the underlying Java driver to support new features. See the changelog for information about changes between releases.
Please see the downloading instructions for information on getting and using the MongoDB Spark Connector.
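As a rough sketch of basic usage with a 2.x connector (assuming spark.mongodb.input.uri is configured as shown earlier), a collection can be loaded into a DataFrame as follows; exact packages and APIs can differ between releases, so defer to the official documentation for your version.

import com.mongodb.spark.MongoSpark
import org.apache.spark.sql.SparkSession

// Read the collection named by spark.mongodb.input.uri into a DataFrame
// and print its inferred schema.
val spark = SparkSession.builder().appName("mongo-spark-read").getOrCreate()
val df = MongoSpark.load(spark)
df.printSchema()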
To build the driver:
$ git clone https://github.com/mongodb/mongo-spark.git
$ cd mongo-spark
$ ./sbt check
To publish the signed jars, first commit and tag all changes to publish, then run:
$ ./sbt +publishArchives
To publish to Spark Packages, use the form at https://spark-packages.org/package/mongodb/mongo-spark. No zip file is needed.
- Ross Lawley ross@mongodb.com
Additional contributors can be found here.