Java API

In addition to the primary Scala API, the connector provides convenience APIs when accessed from Java.

Java APIs

To use the Java API with Spark, you first need to initialize a JavaSparkContext:

SparkConf conf = new SparkConf()
    .setAppName("javaSample")
    .setMaster("local[*]")
    .set("com.couchbase.bucket.travel-sample", "");

JavaSparkContext sc = new JavaSparkContext(conf);
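Properties of the form com.couchbase.bucket.<name> tell the connector which buckets to open; the value is the bucket password, which is empty for travel-sample here.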

Since Java does not support Scala's implicits, the connector provides a helper class that achieves similar functionality:

// The Couchbase-enabled Spark context
CouchbaseSparkContext csc = couchbaseContext(sc);

The couchbaseContext method is imported statically. In general, you want to statically import the following:

import static com.couchbase.spark.japi.CouchbaseDocumentRDD.couchbaseDocumentRDD;
import static com.couchbase.spark.japi.CouchbaseSparkContext.couchbaseContext;

Now you can create RDDs through Key/Value lookups, N1QL queries, or Views:

// Load docs through K/V
List<JsonDocument> docs = csc
    .couchbaseGet(Arrays.asList("airline_10226", "airline_10748"))
    .collect();

System.out.println(docs);

// Perform a N1QL query
List<CouchbaseQueryRow> results = csc
    .couchbaseQuery(N1qlQuery.simple("SELECT * FROM `travel-sample` LIMIT 10"))
    .collect();

System.out.println(results);
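Views follow the same pattern. The following is a minimal sketch which assumes that a design document and view (named "airlines"/"by_name" here purely for illustration) already exist on the bucket, and that the connector exposes a couchbaseView method on the Couchbase-enabled context mirroring the Scala API:

// Query a View (design document and view names are hypothetical)
List<CouchbaseViewRow> rows = csc
    .couchbaseView(ViewQuery.from("airlines", "by_name").limit(10))
    .collect();

System.out.println(rows);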

If you want to store documents in Couchbase, wrap the RDD with the couchbaseDocumentRDD method and call saveToCouchbase():

// Create a single document and persist it into the bucket
couchbaseDocumentRDD(
    sc.parallelize(Arrays.asList(JsonDocument.create("doc1", JsonObject.empty())))
).saveToCouchbase();
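As a slightly larger usage sketch (the document IDs and field values below are made up for illustration), you can build several documents from plain Java data and store them in one pass:

// Hypothetical documents built from plain Java data
List<JsonDocument> toStore = Arrays.asList(
    JsonDocument.create("user::1", JsonObject.create().put("name", "Alice")),
    JsonDocument.create("user::2", JsonObject.create().put("name", "Bob"))
);

// Wrap the RDD and store all documents in the configured bucket
couchbaseDocumentRDD(sc.parallelize(toStore)).saveToCouchbase();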