vkrot asked · Erick Ramirez answered
DSBulk reports "Can't get more results because the continuous query has failed already"

Hi,

I need to unload a 5 TB table to CSV, but dsbulk consistently fails with this error:

Can't get more results because the continuous query has failed already. Most likely this is because the query was cancelled

This is a single-node cluster; I created it and imported a 5 TB snapshot solely for the purpose of exporting the data. The node runs only DSE 6.8.13 and dsbulk.

This is how I start dsbulk:

/bin/dsbulk unload -h localhost -k keyspace -t cf -url /mnt/export --monitoring.csv=true --monitoring.console=true --connector.csv.maxRecords=100000000

After several hours it fails with the aforementioned error. The Cassandra logs show nothing at the time of the failure.

Any ideas?

Stack trace:

Statement: com.datastax.oss.driver.internal.core.cql.DefaultBoundStatement@1d07c8d6 [2 values, idempotence: <UNSET>, CL: <UNSET>, serial CL: <UNSET>, timestamp: <UNSET>, timeout: <UNSET>]
SELECT device_id, data_item_id, bucket_time, data_time, data_value FROM myplant.ts_data_double WHERE token(device_id, data_item_id, bucket_time) > :start AND token(device_id, data_item_id, bucket_time) <= :end
start: -4035225266123964417
end: -3891110078048108545
com.datastax.oss.dsbulk.executor.api.exception.BulkExecutionException: Statement execution failed: SELECT device_id, data_item_id, bucket_time, data_time, data_value FROM myplant.ts_data_double WHERE token(device_id, data_item_id, bucket_time) > :start AND token(device_id, data_item_id, bucket_time) <= :end (Can't get more results because the continuous query has failed already. Most likely this is because the query was cancelled)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.toErrorPage(ResultSubscription.java:534)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.lambda$fetchNextPage$1(ResultSubscription.java:372)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.fetchNextPage(ResultSubscription.java:362) [4 skipped]
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.lambda$fetchNextPage$3(ResultSubscription.java:385)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.dequeue(ResultSubscription.java:443) [4 skipped]
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.tryNext(ResultSubscription.java:302)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.drain(ResultSubscription.java:254)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.request(ResultSubscription.java:202)
        at java.lang.Thread.run(Thread.java:748) [10 skipped]
Caused by: java.util.concurrent.CancellationException: Can't get more results because the continuous query has failed already. Most likely this is because the query was cancelled
        at com.datastax.dse.driver.internal.core.cql.continuous.ContinuousRequestHandlerBase$NodeResponseCallback.cancelledResultSetFuture(ContinuousRequestHandlerBase.java:1545)
        at com.datastax.dse.driver.internal.core.cql.continuous.ContinuousRequestHandlerBase$NodeResponseCallback.dequeueOrCreatePending(ContinuousRequestHandlerBase.java:1237)
        at com.datastax.dse.driver.internal.core.cql.continuous.ContinuousRequestHandlerBase.lambda$fetchNextPage$2(ContinuousRequestHandlerBase.java:305)
        at com.datastax.dse.driver.internal.core.cql.continuous.ContinuousRequestHandlerBase.fetchNextPage(ContinuousRequestHandlerBase.java:299) [3 skipped]
        at com.datastax.dse.driver.internal.core.cql.continuous.DefaultContinuousAsyncResultSet.fetchNextPage(DefaultContinuousAsyncResultSet.java:102)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription$Page.nextPage(ResultSubscription.java:578)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.fetchNextPage(ResultSubscription.java:346)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.lambda$fetchNextPage$3(ResultSubscription.java:385)
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.dequeue(ResultSubscription.java:443) [4 skipped]
        at com.datastax.oss.dsbulk.executor.api.subscription.ResultSubscription.tryNext(ResultSubscription.java:302)

1 Answer

Erick Ramirez answered

DSBulk has an embedded Java driver which connects to the cluster. This exception indicates that the request to the cluster failed because the query was cancelled:

Caused by: java.util.concurrent.CancellationException: Can't get more results because the continuous query has failed already. Most likely this is because the query was cancelled

It isn't possible to determine the cause without additional information, but I suspect it is related to the cluster being a single node holding 5 TB of data.
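One avenue that may be worth trying in the meantime (an assumption on my part, not a confirmed fix for this case) is to give continuous paging more headroom by raising its timeouts in the driver settings that dsbulk passes through under the `datastax-java-driver` prefix, for example via a config file supplied with `-f`. A sketch, with illustrative values:

```
# Sketch only: driver settings passed through by dsbulk (e.g. via -f dsbulk.conf).
# Option names come from the DataStax Java driver 4.x reference configuration;
# the timeout values below are illustrative, not tested recommendations.
datastax-java-driver {
  advanced.continuous-paging {
    timeout {
      # Defaults are on the order of seconds; a heavily loaded single node
      # may need much longer to produce each page of results.
      first-page = 5 minutes
      other-pages = 5 minutes
    }
  }
}
```

Whether this helps depends on why the server side cancelled the query, which is why a support ticket with diagnostics is still the right next step.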

Please log a ticket with DataStax Support so one of our engineers can investigate. Cheers!
