We have Spark jobs running on version 2.4.1 (Scala 2.11). For test purposes, we use a Cassandra Docker image (3.11.4), and we use version 2.4.0 of the Spark Cassandra Connector to read from and write to a Cassandra table. We have a lot of functional tests that execute correctly in this environment.
We'd like to upgrade to the latest version of Spark: 3.1.0 (Scala 2.12). Apart from some small warnings in the Scala code of our Spark jobs, upgrading the different components was not a problem. Unfortunately, we have some issues with the Spark Cassandra Connector.

First, on the write side of the process, we got `Invalid row size: 20 instead of 19`. We were able to solve it by replacing the call to `saveToCassandra` with `save` with `withWriteTime`. However, we still have some trouble and are not able to read from and write to the Cassandra table correctly. I know the connector has a lot of parameters, but do you have an idea of a parameter that changed between the Scala 2.11 and Scala 2.12 versions of the Cassandra connector?
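For context, this is roughly the version alignment we are testing in our build (a sketch of our `build.sbt`, not the exact file; the artifact coordinates are the published Datastax ones, and the exact connector version is what we are unsure about):

```scala
// build.sbt (sketch) -- target versions for the upgrade
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  // Spark 3.1.0, published for Scala 2.12
  "org.apache.spark" %% "spark-core" % "3.1.0" % Provided,
  "org.apache.spark" %% "spark-sql"  % "3.1.0" % Provided,

  // Spark Cassandra Connector: the _2.12 artifact line starts at 2.4.x,
  // but the 3.x connector series is the one built against Spark 3.x.
  // Which exact version to pin here is part of our question.
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.1.0"
)
```

Our functional tests run against the Cassandra 3.11.4 Docker image mentioned above.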
Any help would be appreciated.