Environment:
Spark 3.1.1, Cassandra 3.11.4, connector 3.0.1
I'm unable to access any materialized view that used to work with v2.4.2 of the connector. Here are the steps to reproduce the behavior; what am I missing?
spark-shell \
  --conf spark.cassandra.connection.host=<> \
  --conf "spark.cassandra.auth.username=<>" \
  --conf "spark.cassandra.auth.password=<>" \
  --conf "spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions" \
  --packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.1
I also tried launching the shell with the dependency jars supplied locally, with identical results:
spark-shell \
  --conf "spark.cassandra.connection.host=<>" \
  --conf "spark.cassandra.auth.username=<>" \
  --conf "spark.cassandra.auth.password=<>" \
  --jars ${HOME}/lib/kafka-clients-2.7.0.jar,\
${HOME}/lib/spark-cassandra-connector_2.12-3.0.0.jar,\
${HOME}/lib/spark-cassandra-connector-driver_2.12-3.0.0.jar,\
${HOME}/lib/java-driver-core-shaded-4.7.2.jar,\
${HOME}/lib/java-driver-shaded-guava-25.1-jre-graal-sub-1.jar,\
${HOME}/lib/config-1.3.4.jar,\
${HOME}/lib/native-protocol-1.4.10.jar,\
${HOME}/lib/reactive-streams-1.0.2.jar,\
${HOME}/lib/jnr-posix-3.1.5.jar
Script:
import org.apache.spark.sql.cassandra._
import org.apache.spark.sql._

spark.read.cassandraFormat("mymview", "mykeyspace").load()
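For reference, the same table metadata lookup can also be exercised through the connector's DataSourceV2 catalog (the `CassandraCatalog` class that appears in the stack trace below). This is only a sketch; the catalog name `cass` is an arbitrary choice of mine, and the keyspace/view names are the placeholders from above:

```scala
// Register the connector's catalog under an arbitrary name ("cass" is an assumption)
spark.conf.set(
  "spark.sql.catalog.cass",
  "com.datastax.spark.connector.datasource.CassandraCatalog")

// Read the materialized view through the catalog; this goes through the same
// CassandraCatalog.getTableMetaData path seen in the stack trace
spark.sql("SELECT * FROM cass.mykeyspace.mymview").show()
```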
Error:
java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.analysis.NoSuchTableException: method <init>(Ljava/lang/String;)V not found
  at com.datastax.spark.connector.datasource.CassandraCatalog$.tableMissing(CassandraCatalog.scala:462)
  at com.datastax.spark.connector.datasource.CassandraCatalog$.$anonfun$getTableMetaData$2(CassandraCatalog.scala:425)
  at java.util.Optional.orElseThrow(Optional.java:290)
  at com.datastax.spark.connector.datasource.CassandraCatalog$.getTableMetaData(CassandraCatalog.scala:425)
  at org.apache.spark.sql.cassandra.DefaultSource.getTable(DefaultSource.scala:68)
  at org.apache.spark.sql.cassandra.DefaultSource.inferSchema(DefaultSource.scala:72)
  at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Utils$.getTableFromProvider(DataSourceV2Utils.scala:81)
  at org.apache.spark.sql.DataFrameReader.$anonfun$load$1(DataFrameReader.scala:296)
  at scala.Option.map(Option.scala:230)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:266)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:226)
  ... 51 elided