Hi,
I am trying to set up a remote connection to a DataStax cluster with the spark-shell. When I access the Spark shell directly on a node of my cluster with the command:
dse -u cassandra -p xxx spark
everything works fine. However, when I try remotely with the command:
${SPARK_HOME}/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.11:2.3.0 --conf spark.cassandra.connection.host=172.17.4.253 --conf spark.cassandra.auth.username=cassandra --conf spark.cassandra.auth.password=xxxxxx --master spark://172.17.4.253:7077
I get a connection refused error: Failed to connect to master.
21/04/19 14:53:15 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 172.17.4.253:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
    at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anon$1.run(StandaloneAppClient.scala:107)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: java.lang.IllegalStateException: Expected SaslMessage, received something else (maybe your client does not have SASL enabled?)
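The "Expected SaslMessage" line suggests the master's RPC endpoint expects SASL authentication, which the plain open-source spark-shell is not sending. For context only, this is a rough sketch of what enabling Spark's standard SASL-based RPC authentication on the client side would look like; spark.authenticate and spark.authenticate.secret are standard Spark configuration options, but whether DSE uses this exact mechanism (and what shared secret it expects) is an assumption on my part:

```shell
# Hypothetical sketch: turn on Spark's SASL RPC authentication on the client.
# <shared-secret> is a placeholder; DSE may manage authentication differently,
# so this is shown only to illustrate the client-side knobs involved.
${SPARK_HOME}/bin/spark-shell \
  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.3.0 \
  --conf spark.cassandra.connection.host=172.17.4.253 \
  --conf spark.cassandra.auth.username=cassandra \
  --conf spark.cassandra.auth.password=xxxxxx \
  --conf spark.authenticate=true \
  --conf spark.authenticate.secret=<shared-secret> \
  --master spark://172.17.4.253:7077
```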