Spark MongoDB Write Error - com.mongodb.MongoBulkWriteException: Bulk write operation error on server (E11000 duplicate key error collection)
You may see the following error or exception while running Spark 2.4 with:

- mongo-spark-connector_2.11-2.4.0.jar
- mongo-java-driver-3.9.0.jar

Exception:

User class threw exception: org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 6.0 failed 4 times, most recent failure: Lost task 2.3 in stage 6.0 (TID 238, nc0020.hadoop.mycluster.com, executor 2): com.mongodb.MongoBulkWriteException: Bulk write operation error on server vondbd0008.mymachine.com:27017. Write errors: [BulkWriteError{index=0, code=11000, message='E11000 duplicate key error collection: POC1_DB.MyCollection index: _id_ dup key: { _id: "113442141" }', details={ }}].
    at com.mongodb.connection.BulkWriteBatchCombiner.getError(BulkWriteBatchCombiner.java:177)
    at com.mongodb.connection.BulkWriteBatchCombiner.throwOnError(BulkWriteBatchCombiner.java:206)
    at com.mongodb.connection.BulkWriteBatchCombiner.getResult(BulkWriteBatchCombiner.java:147)
    at com.mongodb.operation.BulkWrite...
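The E11000 error means the job is inserting a document whose _id already exists in POC1_DB.MyCollection (here _id: "113442141"). Below is a minimal sketch of one common remedy, assuming the Scala API of mongo-spark-connector 2.4: deduplicate the batch on _id and set the connector's replaceDocument option so existing documents are replaced (upserted) instead of re-inserted. The object name, sample data, and URI are hypothetical placeholders, not taken from the failing job.

```scala
import com.mongodb.spark.MongoSpark
import com.mongodb.spark.config.WriteConfig
import org.apache.spark.sql.SparkSession

object DedupedMongoWrite {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("deduped-mongo-write")
      // Hypothetical URI; substitute your own host and namespace.
      .config("spark.mongodb.output.uri",
        "mongodb://vondbd0008.mymachine.com:27017/POC1_DB.MyCollection")
      .getOrCreate()

    import spark.implicits._

    // Hypothetical input standing in for whatever DataFrame your job writes;
    // the repeated _id is the kind of row that triggers E11000 on insert.
    val df = Seq(
      ("113442141", "first"),
      ("113442141", "second")
    ).toDF("_id", "value")

    // 1) Remove duplicate _id values within the batch itself.
    val deduped = df.dropDuplicates("_id")

    // 2) replaceDocument = true tells the connector to replace documents
    //    that already exist with the same _id (an upsert) rather than
    //    attempting a plain insert that would violate the _id_ index.
    val writeConfig = WriteConfig(Map(
      "uri" -> "mongodb://vondbd0008.mymachine.com:27017/POC1_DB.MyCollection",
      "replaceDocument" -> "true"
    ))

    MongoSpark.save(deduped, writeConfig)
    spark.stop()
  }
}
```

Note the two fixes address different collisions: dropDuplicates("_id") removes duplicates inside the batch being written, while replaceDocument handles _id values that already exist in the target collection. If you genuinely need insert-only semantics, filter the batch against the collection's existing _id values before writing instead.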