nicosang asked Erick Ramirez answered

Issue updating Spark connector from 2.4.0 to 3.1.0


We have Spark jobs in version 2.4.1 (Scala 2.11). For test purposes, we use a Cassandra Docker image (3.11.4). We use version 2.4.0 of the Spark Cassandra Connector to read from and write to a Cassandra table. We have a lot of functional tests that execute correctly in this environment.

We'd like to upgrade to the latest version of Spark: 3.1.0 (Scala 2.12). Apart from some small warnings in the Scala code of our Spark jobs, we had no problem upgrading the different elements. Unfortunately, we have some issues with the Spark Cassandra Connector. First of all, on the write part of the process, we got `Invalid row size: 20 instead of 19`. We were able to solve that by replacing the call to `saveToCassandra` with `save` plus `withWriteTime`. We still have some trouble and are not able to read from and write to the Cassandra table correctly. I know the connector has a lot of parameters, but do you have an idea of a parameter that changed between the Scala 2.11 and 2.12 versions of the Cassandra Connector?
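For context, with connector 3.x the DataFrame writer is the usual write path, and the DataFrame's columns must match the target table's columns exactly, which is one common cause of errors like "Invalid row size: 20 instead of 19". A minimal sketch is below; the contact point, keyspace, table, and input path are placeholders, not details from the question:

```scala
import org.apache.spark.sql.SparkSession

object CassandraWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("connector-3-write")
      // Hypothetical contact point; adjust to your Cassandra Docker setup.
      .config("spark.cassandra.connection.host", "127.0.0.1")
      .getOrCreate()

    // Hypothetical input; any DataFrame whose columns match the table works.
    val df = spark.read.parquet("/path/to/input")

    // DataFrame write via the connector's data source. If df carries an
    // extra column the table does not have, drop it before saving.
    df.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "ks", "table" -> "my_table"))
      .mode("append")
      .save()

    spark.stop()
  }
}
```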

Any help would be appreciated.




1 Answer

Erick Ramirez answered

You can have a look at the list of changes between releases, but without much detail on the problems you're running into, it's difficult to give a meaningful answer.

But as a general response, you will need to rework and recompile your app so it works with Apache Spark 3.1, since an app written for Spark 2.4 will not be compatible with version 3.1 of the connector.
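Concretely, the recompile means moving the build to Scala 2.12 and aligning the Spark and connector artifacts. A sketch of the build.sbt changes, with the exact patch versions as assumptions to check against the connector's compatibility matrix:

```scala
// build.sbt (sketch): Scala 2.12 with matching Spark and connector versions.
scalaVersion := "2.12.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "3.1.0" % Provided,
  // Spark Cassandra Connector 3.1.x targets Spark 3.1 / Scala 2.12.
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.1.0"
)
```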

I'm going to reach out to the developers of the connector and ask them to respond to you directly. Cheers!
