Hi guys,
I'm having an issue when I use COPY FROM with a .csv file:
Starting copy of employee_by_id with columns [company_id, employee_id, access_code, access_code_expiration_date_time, avatar_url, birth_date, current_employments, email, external_code, external_id, first_name, last_name, minor, modified_date, past_employments, punch_badge_id, seniority_date, status, username, wj_admin].
<stdin>:1:Failed to import 2000 rows: Error - field larger than field limit (131072), given up after 1 attempts
<stdin>:1:Exceeded maximum number of insert errors 1000
<stdin>:1:Failed to process 2000 rows; failed rows written to import_employee_by_id.err
<stdin>:1:Exceeded maximum number of insert errors 1000
I already modified both of the files below (I wasn't sure why there was a cqlshrc.default, so I changed both):
-rwxrwxrwx 1 cassandra cassandra 1418 Mar 19 16:41 cqlsh
-rwxr-xr-x 1 cassandra cassandra 1418 Mar 19 16:41 cqlshrc.default
and added
[csv] field_size_limit = 1000000000
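In full, the section I added looks like this (only the [csv] part shown; the comments are mine, and I'm assuming the semicolon comment syntax is right for this file):

```
; cqlshrc configuration snippet
[csv]
; raise the CSV field size cap from the default 131072 bytes
field_size_limit = 1000000000
```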
But the error is still happening... what could it be?
Thank you