Bringing together the Apache Cassandra experts from the community and DataStax.

srinu.gajjala321_68185 asked · Erick Ramirez edited

Passing cluster config in the Spark Cassandra Connector when using RDDs


I'm trying to set a row-level TTL on a table. Since the DataFrame API doesn't support row-level TTLs, I'm using the RDD API to set the TTL. I set the cluster config (host address, username, password, etc.) in the Spark conf, but when I connect to the cluster, the write doesn't honor the cluster that is set in the Spark conf.

This is what I'm using:

rdd.saveToCassandra(keyspace, table, writeConf = WriteConf(ttl = TTLOption.perRow(ttlColumn)))

Also, is there a way I can pass the whole Spark conf with this write, along with the TTL?

Any help would be appreciated.



1 Answer

Russell Spitzer answered · srinu.gajjala321_68185 commented

This is possible. In the Scala API, the saveToCassandra method has an implicit parameter, "connector". You can always set this parameter explicitly, or set an implicit connector for an operation.

An example of this is in my blog post, which shows how we can create a code block whose implicit connector is different from the rest of the code:

    // Sets connectorToClusterTwo as the default connection for everything in this code block
    implicit val c = connectorToClusterTwo

Whatever connector is set as implicit within the scope of the saveToCassandra call will be used.
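Putting this together with the TTL write from the question, a minimal sketch might look like the following. This assumes an existing SparkContext `sc` and the `rdd`, `keyspace`, `table`, and `ttlColumn` values from the question; the host and credentials are placeholders:

```scala
import com.datastax.spark.connector._
import com.datastax.spark.connector.cql.CassandraConnector
import com.datastax.spark.connector.writer.{TTLOption, WriteConf}

// Build a conf pointing at the target cluster (host/credentials are placeholders)
val clusterTwoConf = sc.getConf.clone()
  .set("spark.cassandra.connection.host", "10.0.0.2")
  .set("spark.cassandra.auth.username", "cassandra")
  .set("spark.cassandra.auth.password", "cassandra")

// Any saveToCassandra call in this scope picks up this connector implicitly
implicit val c = CassandraConnector(clusterTwoConf)

rdd.saveToCassandra(keyspace, table,
  writeConf = WriteConf(ttl = TTLOption.perRow(ttlColumn)))
```

Because `c` is declared implicit in the enclosing scope, the compiler supplies it as the `connector` argument to saveToCassandra, so the write goes to the cluster described by `clusterTwoConf` rather than the context's default.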

1 comment

Thanks for the reply, Russell. How can I pass all the conf, like the SSL config, username, password, etc.? I tried to pass it like this:

val connectorToClusterOne = CassandraConnector(sparkSession.sparkContext.getConf)

and it's still not accepting it.
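One way to make the settings explicit, rather than relying on whatever happens to be in the context's conf already, is to clone the conf and set the `spark.cassandra.*` connection properties directly before building the connector. A sketch, with all property values as placeholders:

```scala
import com.datastax.spark.connector.cql.CassandraConnector

// Clone the existing conf and set connection properties explicitly
// (all values below are placeholders)
val clusterOneConf = sparkSession.sparkContext.getConf.clone()
  .set("spark.cassandra.connection.host", "10.0.0.1")
  .set("spark.cassandra.auth.username", "cassandra")
  .set("spark.cassandra.auth.password", "cassandra")
  .set("spark.cassandra.connection.ssl.enabled", "true")
  .set("spark.cassandra.connection.ssl.trustStore.path", "/path/to/truststore.jks")
  .set("spark.cassandra.connection.ssl.trustStore.password", "changeit")

implicit val connectorToClusterOne = CassandraConnector(clusterOneConf)
```

With the properties set on the conf the connector is built from, there is no ambiguity about which cluster and credentials the implicit connector carries.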
