I get a CodecNotFoundException when writing to a blob field inside a UDT, using Spark 3.1.2, spark-cassandra-connector 3.1.0, and Cassandra 3.11.10.
I have a UDT:
CREATE TYPE testks.bar ( bin blob );
And a table:
CREATE TABLE testks.foo ( pk text PRIMARY KEY, b frozen<bar> );
When I try inserting a new row into foo with non-null data for the field bin inside UDT bar, using spark-cassandra-connector 3.1.0 from spark-shell (3.1.2), the write fails with this exception:
com.datastax.oss.driver.api.core.type.codec.CodecNotFoundException: Codec not found for requested operation: [BLOB <-> java.nio.HeapByteBuffer]
I tried both the RDD and Dataset APIs and got the same exception with each.
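For reference, here is a minimal sketch of the failing Dataset write, as run from spark-shell (assumes the shell was started with the connector package and spark.cassandra.connection.host configured; the case class names Bar and Foo are mine and just mirror the schema above):

```scala
import org.apache.spark.sql.cassandra._

// Hypothetical case classes mirroring the UDT and table from the question.
case class Bar(bin: Array[Byte])
case class Foo(pk: String, b: Bar)

val df = Seq(Foo("k1", Bar(Array[Byte](1, 2, 3)))).toDF()

// This write throws:
// CodecNotFoundException: Codec not found for requested operation:
//   [BLOB <-> java.nio.HeapByteBuffer]
df.write
  .cassandraFormat("foo", "testks")
  .mode("append")
  .save()
```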
If I use table:
CREATE TABLE testks.foo2 ( pk text PRIMARY KEY, bin blob );
I can insert rows without issues. Reading data into Spark from both foo and foo2 also works.
It seems strange that the codec BLOB <-> java.nio.HeapByteBuffer can't be found when writing to a blob inside a UDT, since I'd assume the same codec is used when reading blobs and when writing to a blob field that is not inside a UDT. Any ideas?
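In case it's relevant: as far as I can tell, java.nio.HeapByteBuffer in the error message is not a type of my own; it's simply the (package-private) concrete class that ByteBuffer.wrap returns, so any byte array turned into a buffer on the way to the driver will have that runtime class:

```scala
import java.nio.ByteBuffer

// ByteBuffer.wrap returns an instance of the non-public class
// java.nio.HeapByteBuffer, even though the static type is ByteBuffer.
val buf: ByteBuffer = ByteBuffer.wrap(Array[Byte](1, 2, 3))
println(buf.getClass.getName) // prints "java.nio.HeapByteBuffer"
```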