Hi, I am using Scala 2.11 with Spark Cassandra connector 2.3.0.
I have a table like this:
CREATE TABLE dcmapp (
    t text,
    p text,
    v double,
    d blob,
    f text,
    i tinyint,
    PRIMARY KEY ((t, p), v)
)
The table has a two-column partition key. When I try to fetch rows filtering on column "t", which is part of the partition key, like this:
sc.cassandraTable("kevin", "dcmapp").select("t", "p").where("t = ?", "A1").collect().foreach(println)
I get the following error:
java.lang.UnsupportedOperationException: Partition key predicate must include all partition key columns or partition key columns need to be indexed. Missing columns: p
If I query by i instead, the query succeeds.
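For example, something like this runs without error (the tinyint value 1 is just a placeholder):
sc.cassandraTable("kevin", "dcmapp").select("t", "p").where("i = ?", 1).collect().foreach(println)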
Is it not possible to query with a partial partition key through the connector?
If I run the same query via cqlsh, specifying only column "t" in the WHERE clause and adding ALLOW FILTERING, I do get results:
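SELECT t, p FROM kevin.dcmapp WHERE t = 'A1' ALLOW FILTERING;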