vgauthier asked

Unable to run spark-shell remotely, getting "Failed to connect to master"

Hi

I am trying to set up a remote connection to a DataStax cluster with the spark-shell. When I access the Spark shell directly on a node of my cluster with the command:

dse -u cassandra -p xxx spark

everything works fine. However, when I try remotely with the command:

${SPARK_HOME}/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.11:2.3.0 --conf spark.cassandra.connection.host=172.17.4.253 --conf spark.cassandra.auth.username=cassandra --conf spark.cassandra.auth.password=xxxxxx --master spark://172.17.4.253:7077

I get a "Failed to connect to master" error:

21/04/19 14:53:15 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 172.17.4.253:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
 at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
 at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
 at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
 at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
 at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anon$1.run(StandaloneAppClient.scala:107)
 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
 at java.util.concurrent.FutureTask.run(FutureTask.java:266)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
 at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: java.lang.IllegalStateException: Expected SaslMessage, received something else (maybe your client does not have SASL enabled?)
spark-cassandra-connector

1 Answer

jaroslaw.grabowski_50515 answered

Do you have a standalone Spark cluster? It seems like you are trying to use a standalone Spark cluster to connect to DSE. You need to specify the correct master address; 172.17.4.253 is probably not what you want, since, judging by your command, that is where DSE (and DSE Spark) lives.
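As a quick sanity check, one option (assuming your DSE version ships the client-tool utility) is to ask DSE itself which Spark master address it advertises, and compare that with the value you pass to --master:

dse client-tool spark master-address

If the address printed there differs from spark://172.17.4.253:7077, try using the printed address in your remote spark-shell invocation instead.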

1 comment

Hello Jaroslaw,

No, I don't have a standalone Spark cluster; I use DSE Spark. The master address seems fine to me: when I use the dse command line to access the Spark shell, everything works great.

dse spark --master  spark://172.17.4.253:7077

Moreover, when I try to connect to a random address I get the traditional error message, "Connection refused":

Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused

Instead, when I try to connect to the DSE Spark master, I get the "Failed to connect to master" error shown above.
