question

tomas asked · Erick Ramirez edited

COPY TO command is reporting "ExportProcess.write_rows_to_csv(): writing row" for each exported row

Hello,

I was asked to provide an export to CSV of one of the tables in our DataStax cluster. I decided to use the COPY TO command, exactly as follows:

cqlsh 192.168.1.10
copy keyspace.table to 'keyspace.table_20210426.csv';

For each exported row I am getting the following debug message (1,000 rows in the table, 1,000 messages like this during COPY TO in cqlsh>):

...
cqlshlib.copyutil.ExportProcess.write_rows_to_csv(): writing row
cqlshlib.copyutil.ExportProcess.write_rows_to_csv(): writing row
cqlshlib.copyutil.ExportProcess.write_rows_to_csv(): writing row
...

We have multiple DataStax clusters, but only this one is showing such debug info for each exported line. Can you please advise where to look to suppress these informational messages while COPY TO is running?

Thank you.


1 Answer

Erick Ramirez answered · edited

When Python 3 support was added to Cassandra (DB-4151), a print() statement added for debugging during development was inadvertently left behind in the code.

This has been addressed in newer versions of DataStax Enterprise, and exporting data with the COPY command is now less noisy in DSE 6.8.11+, 6.7.13+, and the yet-to-be-released 6.0.16 and 5.1.23 (DSP-21494).

As a side note, the COPY command is designed for development use only, for working with a few hundred records. We recommend you use the DataStax Bulk Loader tool (DSBulk) instead.

DSBulk supports exporting and loading data in CSV or JSON format, as well as counting records. It is fully open-source and works with DSE, Apache Cassandra and Astra. For examples of exporting data, see the Bulk Loader - Unloading blog post. Cheers!
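
For reference, a minimal unload sketch using the host, keyspace and table names from your question might look like this (the output directory name is just an example; DSBulk writes CSV by default):

# minimal sketch: unload keyspace.table to CSV files under ./keyspace.table_20210426
dsbulk unload -h 192.168.1.10 -k keyspace -t table -url keyspace.table_20210426

Note that for an unload, -url points to a directory: DSBulk writes one or more CSV part files into it rather than a single file.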
