jaliase asked:

Frozen list not being Unset by the Spark connector

We have a column in a table of type frozen<list<text>>. We are updating the value of the column in a Spark job; however, if the list object is null in the output RDD, the Spark connector sets the value to [] instead of unsetting it. We have a global configuration to unset when null.
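
For reference, a minimal sketch of the kind of write we are doing, assuming the spark.cassandra.output.ignoreNulls property is the "global configuration" mentioned above; the keyspace, table, and column names (ks.items, id, tags) are placeholders, not our real schema.

```scala
import org.apache.spark.sql.SparkSession
import com.datastax.spark.connector._

// Hypothetical table: ks.items(id text PRIMARY KEY, tags frozen<list<text>>)
case class Item(id: String, tags: List[String])

val spark = SparkSession.builder()
  .appName("unset-frozen-list")
  // Assumption: this is the global "unset when null" configuration referred to above.
  .config("spark.cassandra.output.ignoreNulls", "true")
  .getOrCreate()
val sc = spark.sparkContext

// tags is null for the second row; with ignoreNulls we expect the cell to be
// left unset rather than written as an empty list.
val rdd = sc.parallelize(Seq(Item("a", List("x", "y")), Item("b", null)))
rdd.saveToCassandra("ks", "items", SomeColumns("id", "tags"))
```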

spark-cassandra-connector

1 Answer

jaroslaw.grabowski_50515 answered:

Please check the row via cqlsh. I suspect that the value is actually unset in the row, and the "[]" you see comes from a Spark Cassandra Connector read (this is how SCC represents empty lists).
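
A small sketch of what this check might look like, reusing the hypothetical ks.items table and column names from the question; the cqlsh query in the comment is the part that actually tells you whether the cell was left unset.

```scala
import com.datastax.spark.connector._

// In cqlsh, a column that was left unset shows up as null:
//   SELECT id, tags FROM ks.items WHERE id = 'b';
// If cqlsh shows null, the connector did not write an empty list.

// Reading the same row back through the connector, an unset list column comes
// back as an empty Scala collection, which is where the [] in the job output
// comes from.
val row = sc.cassandraTable("ks", "items").where("id = ?", "b").first()
println(row.getList[String]("tags"))   // empty collection, e.g. Vector()
```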
