Hi all,
I am trying to migrate data from SQLite3 to Cassandra. The flow is: I export the SQLite3 data to a CSV file and then load that CSV into Cassandra using the DSBulk loader. I have made sure the SQLite3 schema is compatible with the Cassandra schema.
The scenario is this: in SQLite3 I have a column with the FLOAT datatype, and I convert the SQLite3 file containing that FLOAT column to CSV. The resulting CSV has 1599.6000000000001 as one of the values in that (FLOAT) column. When I load this CSV with the DSBulk loader, I get this exception:
java.lang.ArithmeticException: Cannot convert 1599.6000000000001 from BigDecimal to Float
    at com.datastax.oss.dsbulk.codecs.api.util.CodecUtils.conversionFailed(CodecUtils.java:602)
    at com.datastax.oss.dsbulk.codecs.api.util.CodecUtils.toFloatValueExact(CodecUtils.java:529)
    at com.datastax.oss.dsbulk.codecs.api.util.CodecUtils.convertNumber(CodecUtils.java:325)
    at com.datastax.oss.dsbulk.codecs.api.util.CodecUtils.narrowNumber(CodecUtils.java:183)
    at com.datastax.oss.dsbulk.codecs.text.string.StringToNumberCodec.narrowNumber(StringToNumberCodec.java:91)
    at com.datastax.oss.dsbulk.codecs.text.string.StringToFloatCodec.externalToInternal(StringToFloatCodec.java:66)
    at com.datastax.oss.dsbulk.codecs.text.string.StringToFloatCodec.externalToInternal(StringToFloatCodec.java:33)
    at com.datastax.oss.dsbulk.codecs.api.ConvertingCodec.encode(ConvertingCodec.java:70)
    at com.datastax.oss.dsbulk.workflow.commons.schema.DefaultRecordMapper.bindColumn(DefaultRecordMapper.java:160)
    at com.datastax.oss.dsbulk.workflow.commons.schema.DefaultRecordMapper.map(DefaultRecordMapper.java:134)
The SQLite3 FLOAT column is mapped to FLOAT in the Cassandra schema as well.
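To sanity-check the value outside DSBulk, I ran the small Java snippet below (the class and variable names are just mine for illustration, not DSBulk's actual code). It suggests that 1599.6000000000001 cannot be narrowed to a 32-bit float and converted back without losing precision, which looks like the kind of exact conversion that toFloatValueExact is enforcing:

import java.math.BigDecimal;

public class FloatNarrowingCheck {
    public static void main(String[] args) {
        // The value taken from the CSV row that DSBulk rejects.
        BigDecimal csvValue = new BigDecimal("1599.6000000000001");

        // Narrow to a 32-bit float, then turn it back into a BigDecimal.
        float narrowed = csvValue.floatValue();
        BigDecimal roundTripped = new BigDecimal(Float.toString(narrowed));

        System.out.println("original      : " + csvValue);
        System.out.println("as float      : " + narrowed);
        System.out.println("round-tripped : " + roundTripped);
        // false here means the narrowing is lossy, i.e. not an exact conversion.
        System.out.println("exact?        : " + (csvValue.compareTo(roundTripped) == 0));
    }
}

For me this prints exact? : false, so the value really does lose precision when forced into a float.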
Why is this happening? Will Cassandra automatically convert the values accordingly? How can I avoid this without changing the datatype?
Any help would be appreciated.
Thanks