Bringing together the Apache Cassandra experts from the community and DataStax.


info_146038 asked info_146038 answered

DSBulk with unicode and query in cqlsh with unicode

I have data in Cassandra where the primary key (a single column, table_key) contains a key and a value separated by the file separator control character '\x1c' (Unicode U+001C). How can I query for this in the WHERE clause in cqlsh? The data is normally inserted via a Java application.


When my WHERE clause is WHERE table_key = 'key\x1cvalue', it finds nothing, even though the record is there.


Similarly, I want to bulk load data with this same separator via DSBulk, but it does not recognize the separator and treats it as plain text.
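A minimal sketch of the likely mismatch (the key and value shown are hypothetical): CQL single-quoted string literals do not interpret backslash escapes, so typing 'key\x1cvalue' in cqlsh compares against the six literal characters \, x, 1, c rather than the single control character the Java application inserted.

```python
# What cqlsh actually compares: the literal characters backslash-x-1-c.
typed_in_cqlsh = "key\\x1cvalue"

# What the Java app inserted: key and value joined by U+001C (file separator).
stored_by_java = "key" + "\x1c" + "value"

print(typed_in_cqlsh == stored_by_java)  # False: the two strings differ
print(len(typed_in_cqlsh), len(stored_by_java))  # 12 vs 9 characters
```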

dsbulk


1 Answer

info_146038 answered

Loading the data using DSBulk with the raw ASCII character 28 (0x1C) embedded directly in the CSV worked perfectly fine; it was no longer treated as literal text but as the actual file separator.
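A sketch of how such a CSV can be produced, assuming a table with a single text primary key column table_key (the file name, keyspace, and table names below are hypothetical). The point is to write the raw 0x1C byte into the field rather than the four-character sequence "\x1c":

```python
import csv

# ASCII 28 / 0x1C, the file separator control character.
FS = "\x1c"

with open("data.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["table_key"])      # header row
    writer.writerow([f"key{FS}value"])  # key and value joined by the raw 0x1C
```

The file can then be loaded with something like `dsbulk load -url data.csv -k my_keyspace -t my_table -header true` (keyspace and table names illustrative).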
