DataStax Academy FAQ

DataStax Academy migrated to a new learning management system (LMS) in July 2020. We are also moving to a new Cassandra Certification process, so there are changes to exam bookings, the voucher system, and the issuing of certificates.

Check out the Academy FAQ pages for answers to your questions:


Akanksha asked:

Which version of the Spark connector is compatible with Spark 2.4 and Scala 2.11.12 and ScyllaDB?

I am new to Spark and ScyllaDB. I want to set up a write stream to ScyllaDB, but my versions are as follows:

ScyllaDB (Cassandra): 3.0.8

Scala: 2.12.11

Spark: 3.0.0

Please tell me which spark-cassandra-connector version should be used.

Tags: spark-cassandra-connector, scylladb


1 Answer

Erick Ramirez answered:

The versions you provided in the title of your question contradict the versions you posted in the description, so it appears you're not sure what you're asking.

The latest production version of the Spark Cassandra Connector is v2.5.1. As per the documentation, it is compatible with the following:

  • Apache Cassandra 2.1.5 or newer
  • Apache Spark 1.0 to 2.4
  • Scala 2.11 and 2.12
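For reference, pulling in the production release from this line might look as follows in a build.sbt. This is a sketch only: the coordinates are the standard ones published under the com.datastax.spark organization on Maven Central, and the Scala/Spark versions shown are one compatible pairing from the list above.

```scala
// build.sbt -- Spark Cassandra Connector 2.5.1, paired with Spark 2.4 / Scala 2.11
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-sql"                % "2.4.6" % Provided,
  // %% appends the Scala binary version (_2.11) to the artifact name
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.5.1"
)
```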

There is a preview release of the connector, currently at v3.0.0-beta. It is compatible with the following:

  • Apache Cassandra 2.1.5 or newer
  • Apache Spark 3.0
  • Scala 2.12 only

To be clear, we don't test against any version of ScyllaDB, and we are aware that users run into various issues. We are also aware that the folks behind ScyllaDB have forked the Spark connector and maintain it separately. Feel free to reach out to them directly if you have any follow-up questions about ScyllaDB or their forked products. Cheers!
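Since the original question was about a write stream, here is a minimal sketch of one common pattern with the connector: using Structured Streaming's foreachBatch to write each micro-batch through the DataFrame API. The socket source, keyspace ks, and table t are hypothetical placeholders; the target table must already exist with a schema matching the batch's columns.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

object StreamToCassandra {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("stream-to-cassandra")
      // Contact point for the Cassandra-compatible cluster (placeholder)
      .config("spark.cassandra.connection.host", "127.0.0.1")
      .getOrCreate()

    // Hypothetical streaming source: one string column named "value"
    val lines = spark.readStream
      .format("socket")
      .option("host", "localhost")
      .option("port", 9999)
      .load()

    val query = lines.writeStream
      // foreachBatch hands us each micro-batch as a plain DataFrame,
      // so the regular batch writer of the connector can be used
      .foreachBatch { (batch: DataFrame, batchId: Long) =>
        batch.write
          .format("org.apache.spark.sql.cassandra")
          .options(Map("keyspace" -> "ks", "table" -> "t"))
          .mode("append")
          .save()
      }
      .option("checkpointLocation", "/tmp/checkpoints/stream-to-cassandra")
      .start()

    query.awaitTermination()
  }
}
```

foreachBatch sidesteps the need for a dedicated streaming sink: each micro-batch is written atomically per batch, and the checkpoint location lets Spark resume from the last committed offset after a restart.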

2 comments


Sorry, sir. Actually, I was trying to find a solution for both combinations, and I got it working with:

// No extra resolver is needed -- the artifact is published to Maven Central
// (https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector
// only lists it, it is not an sbt resolver)
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "3.0.0-beta"

for

scalaVersion := "2.12.11"

val sparkVersion = "3.0.0"

Cassandra: 3.0.8

Again, we expect you'll run into problems because ScyllaDB isn't supported; it is also a fork of Apache Cassandra. Cheers!
