ayman92140_21632 asked:

Error with CqlWhereParser using Cassandra connector

Hello,

I am trying to use the Spark Cassandra connector to load some data from a table, as follows:


val conf = new SparkConf(true)
  .set("spark.cassandra.connection.host", "localhost")
  .set("spark.cassandra.auth.username", "test")
  .set("spark.cassandra.auth.password", "test")

val spark = SparkSession.builder
  .appName("Datastax Scala example")
  .master("local[*]")
  .config(conf)
  .getOrCreate

val rdd = spark.sparkContext.cassandraTable("test", "orders")
rdd.where("id=?", 123).collect.foreach(println)


But I am getting this exception:


Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.CommandLineWrapper.main(CommandLineWrapper.java:66)
Caused by: java.lang.NoSuchMethodError: scala.util.parsing.combinator.Parsers.$init$(Lscala/util/parsing/combinator/Parsers;)V
at com.datastax.spark.connector.util.CqlWhereParser$.<init>(CqlWhereParser.scala:6)
at com.datastax.spark.connector.util.CqlWhereParser$.<clinit>(CqlWhereParser.scala)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.$anonfun$containsPartitionKey$2(CassandraTableScanRDD.scala:410)
at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:241)
at scala.collection.immutable.List.foreach(List.scala:389)
at scala.collection.TraversableLike.flatMap(TraversableLike.scala:241)
at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:238)
at scala.collection.immutable.List.flatMap(List.scala:352)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.containsPartitionKey(CassandraTableScanRDD.scala:410)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.partitionGenerator$lzycompute(CassandraTableScanRDD.scala:224)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.partitionGenerator(CassandraTableScanRDD.scala:223)
at com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:272)
at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:253)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:251)


Without the where("id=?", 123) clause, it works fine.
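For context, the same filtered read can also be expressed through the DataFrame API, which pushes the predicate down via a different code path than the RDD where clause that triggers CqlWhereParser in the stack trace above. A minimal sketch, reusing the spark session from the code above and assuming the same test/orders table with an id column (whether this avoids the error depends on the underlying cause):

```scala
// Sketch only: DataFrame read of the same table with a pushed-down filter.
// "org.apache.spark.sql.cassandra" is the connector's DataFrame source.
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "test", "table" -> "orders"))
  .load()
  .filter("id = 123")

df.show()
```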

In my Maven POM, I have these dependencies:


<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.2</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.12</artifactId>
    <version>2.4.2</version>
</dependency>


Am I missing something?
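One note on the stack trace: java.lang.NoSuchMethodError on scala.util.parsing.combinator.Parsers.$init$ is the usual symptom of a Scala binary-version mismatch — e.g. a scala-parser-combinators jar built for Scala 2.11 ending up on an otherwise 2.12 classpath, so the trait initializer the 2.12-compiled connector expects does not exist. If that is the cause, explicitly pinning the 2.12 artifact might help; a sketch only (the version number is an assumption, not from the post):

```xml
<!-- Hypothetical fix: pin the Scala 2.12 build of scala-parser-combinators
     so a 2.11 copy pulled in transitively cannot win dependency mediation. -->
<dependency>
    <groupId>org.scala-lang.modules</groupId>
    <artifactId>scala-parser-combinators_2.12</artifactId>
    <version>1.1.2</version>
</dependency>
```

Running mvn dependency:tree shows which Scala binary version each artifact on the classpath was built for.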



sparkconnector

0 Answers