Hello Community,
I have a question regarding the deployment of DSE on Kubernetes.
I have followed the setup instructions provided in the DataStax documentation and the cass-operator GitHub repository.
Steps (the rough commands I used are sketched below the list):
- Created the storage class (cass-operator/operator/k8s-flavors/gke/storage.yaml)
- Created the operator manifests (cass-operator/docs/user/cass-operator-manifests.yaml)
- Created the cluster (cass-operator/operator/example-cassdc-yaml/dse-6.8.x/example-cassdc-three-rack-three-node.yaml)
- Checked that the cluster is up with nodetool status; the database is running.
- Started the Thrift server: dse spark-sql-thriftserver start
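For reference, these are roughly the commands I ran (a sketch from memory; the namespace cass-operator is an assumption, and the manifest paths are relative to my clone of the cass-operator repo):

kubectl create namespace cass-operator                                      # assumed namespace
kubectl -n cass-operator apply -f operator/k8s-flavors/gke/storage.yaml     # storage class (GKE)
kubectl -n cass-operator apply -f docs/user/cass-operator-manifests.yaml    # operator + CRDs
kubectl -n cass-operator apply -f operator/example-cassdc-yaml/dse-6.8.x/example-cassdc-three-rack-three-node.yaml
kubectl -n cass-operator exec -it cluster2-dc1-rack1-sts-0 -c cassandra -- nodetool status   # check the ring
kubectl -n cass-operator exec -it cluster2-dc1-rack1-sts-0 -c cassandra -- dse spark-sql-thriftserver start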
The Thrift server complains about authentication and fails to start.
The authentication settings in the deployment YAML are:
config:
  cassandra-yaml:
    num_tokens: 8
    authenticator: com.datastax.bdp.cassandra.auth.DseAuthenticator
    authorizer: com.datastax.bdp.cassandra.auth.DseAuthorizer
    role_manager: com.datastax.bdp.cassandra.auth.DseRoleManager
  dse-yaml:
    authorization_options:
      enabled: true
    authentication_options:
      enabled: true
      default_scheme: internal
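In case it helps with diagnosing: this is how I would check that those settings actually end up in the rendered config on the node (a sketch; the /config/cassandra.yaml path and the container name cassandra are assumptions on my part):

kubectl -n cass-operator exec -it cluster2-dc1-rack1-sts-0 -c cassandra -- \
  grep -E 'authenticator|authorizer|role_manager' /config/cassandra.yaml    # path assumed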
If I try to start the Thrift server again, it says it is already running, but the process is actually defunct:
dse@cluster2-dc1-rack1-sts-0:~$ dse spark-sql-thriftserver start
org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 running as process 2053. Stop it first.
dse@cluster2-dc1-rack1-sts-0:~$ ps aux
USER   PID %CPU %MEM     VSZ     RSS TTY   STAT START  TIME COMMAND
dse      1  0.4  1.1 3643524  189112 ?     Ssl  11:02  0:21 java -Xms128m -Xmx128m -jar /opt/dse/resources/management-api/management-api-6.8.4-all.jar --dse-socket /tmp/dse.sock --h
dse    816 71.8 16.3 4400432 2680292 ?     Sl   11:02 62:29 /usr/local/openjdk-8/bin/java -Ddse.server_process -XX:+UnlockDiagnosticVMOptions -XX:+AlwaysPreTouch -Dcassandra.disable
dse   1397  1.1  0.2  398804   37472 ?     Ssl  11:03  0:57 collectd_wrapper /opt/dse/resources/dse/collectd/usr/sbin/collectd -C /tmp/dse/dse.NbJPfUW6w/dse-collectd-520598997774501
dse   1518  0.0  0.0    6988    3780 pts/0 Ss   11:07  0:00 bash
dse   2053  0.2  0.0       0       0 pts/0 Z    11:26  0:10 [java] <defunct>
dse   4056  0.0  0.0   10628    3160 pts/0 R+   12:29  0:00 ps aux
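For what it's worth, the matching stop command, as far as I can tell from the DSE CLI, would be the one below, though I do not know whether it cleans up the defunct process:

dse spark-sql-thriftserver stop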
Trying to connect via beeline from my machine through a port-forward does not work.
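The forward looked roughly like this (a sketch: the namespace cass-operator is an assumption, and I used 10000 because that is, as far as I know, the default Spark SQL Thrift server port; I am not sure which port it actually binds to):

kubectl -n cass-operator port-forward cluster2-dc1-rack1-sts-0 10000:10000   # then point beeline at jdbc:hive2://localhost:10000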
Trying to connect via beeline inside the pod does not work either:
dse@cluster2-dc1-rack1-sts-0:~$ /opt/dse/resources/spark/bin/beeline -u jdbc:hive2://localhost:9160/default -n cluster2-superuser -p superpassword
/opt/dse/resources/spark/conf/spark-env.sh: line 155: /dse-spark-env.sh: No such file or directory
Error: Could not find or load main class org.apache.spark.launcher.Main
dse@cluster2-dc1-rack1-sts-0:~$ ls -alh /opt/dse/resources/spark/conf/spark-env.sh
lrwxrwxrwx 1 dse dse 20 Oct 13 11:02 /opt/dse/resources/spark/conf/spark-env.sh -> /config/spark-env.sh
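My guess (only a guess, I have not inspected /config/spark-env.sh) is that line 155 sources a companion file via a variable that is empty in my shell, something like:

. "${SPARK_CONF_DIR}/dse-spark-env.sh"   # hypothetical; with SPARK_CONF_DIR unset this resolves to /dse-spark-env.sh

which would explain the bare /dse-spark-env.sh path and, possibly, why org.apache.spark.launcher.Main is missing from the classpath.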
Can someone point me in the right direction as to what I need to change or fix?
Thank you.
Michael.
Logs:
dse@cluster2-dc1-rack1-sts-0:~$ cat /opt/dse/spark-thrift-server/spark--org.apache.spark.sql.hive.thriftserver-HiveThriftServer2-1-cluster2-dc1-rack1-sts-0.out
Spark Command: /usr/local/openjdk-8/bin/java -cp /opt/dse/resources/spark/conf/:/opt/dse/resources/spark/jars/*:/opt/dse/resources/hadoop2-client/conf/ -Djava.library.path=/opt/dse/resources/hadoop2-client/lib/native:/opt/dse/resources/cassandra/lib/sigar-bin: -Dcassandra.logdir=/var/log/cassandra -XX:MaxHeapFreeRatio=50 -XX:MinHeapFreeRatio=20 -Dguice_include_stack_traces=OFF -Ddse.system_memory_in_mb=16014 -Dcassandra.config.loader=com.datastax.bdp.config.DseConfigurationLoader -Dlogback.configurationFile=/opt/dse/resources/spark/conf/logback-spark.xml -Dcassandra.logdir=/var/log/cassandra -Ddse.client.configuration.impl=com.datastax.bdp.transport.client.HadoopBasedClientConfiguration -Dderby.stream.error.method=com.datastax.bdp.derby.LogbackBridge.getLogger -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 --name Thrift JDBC/ODBC Server spark-internal
========================================
ERROR 2021-10-13 11:26:31,176 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application because of java.io.IOException: Failed to open native connection to Cassandra at {127.0.0.1}:9042
java.io.IOException: Failed to open native connection to Cassandra at {127.0.0.1}:9042
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:184)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$10.apply(CassandraConnector.scala:167)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$10.apply(CassandraConnector.scala:167)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
    at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:114)
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$dynamicConfiguration$2.apply(SparkConfigurator.scala:106)
    at org.apache.spark.deploy.SparkConfigurator$$anonfun$dynamicConfiguration$2.apply(SparkConfigurator.scala:105)
    at scala.util.Try$.apply(Try.scala:192)
    at com.datastax.bdp.util.Lazy.internal$lzycompute(Lazy.scala:26)
    at com.datastax.bdp.util.Lazy.internal(Lazy.scala:25)
    at com.datastax.bdp.util.Lazy.get(Lazy.scala:31)
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps$lzycompute(SparkConfigurator.scala:190)
    at org.apache.spark.deploy.SparkConfigurator.dseDriverProps(SparkConfigurator.scala:189)
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries$lzycompute(SparkConfigurator.scala:160)
    at org.apache.spark.deploy.SparkConfigurator.dseSparkConfEntries(SparkConfigurator.scala:160)
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs$lzycompute(DseSparkArgsPreprocessor.scala:86)
    at org.apache.spark.deploy.DseSparkArgsPreprocessor.updatedArgs(DseSparkArgsPreprocessor.scala:75)
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper$.main(DseSparkSubmitBootstrapper.scala:98)
    at org.apache.spark.deploy.DseSparkSubmitBootstrapper.main(DseSparkSubmitBootstrapper.scala)
Caused by: com.datastax.driver.core.exceptions.AuthenticationException: Authentication error on host localhost/127.0.0.1:9042: Host localhost/127.0.0.1:9042 requires authentication, but no authenticator found in Cluster configuration
    at com.datastax.driver.core.Connection$9.apply(Connection.java:548)
    at com.datastax.driver.core.Connection$9.apply(Connection.java:505)
    at com.google.common.util.concurrent.Futures$AsyncChainingFuture.doTransform(Futures.java:1442)
    at com.google.common.util.concurrent.Futures$AsyncChainingFuture.doTransform(Futures.java:1433)
    at com.google.common.util.concurrent.Futures$AbstractChainingFuture.run(Futures.java:1408)
    at com.google.common.util.concurrent.MoreExecutors$DirectExecutor.execute(MoreExecutors.java:456)
    at com.google.common.util.concurrent.AbstractFuture.executeListener(AbstractFuture.java:817)
    at com.google.common.util.concurrent.AbstractFuture.complete(AbstractFuture.java:753)
    at com.google.common.util.concurrent.AbstractFuture.set(AbstractFuture.java:613)
    at com.datastax.driver.core.Connection$Future.onSet(Connection.java:1554)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1307)
    at com.datastax.driver.core.Connection$Dispatcher.channelRead0(Connection.java:1214)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1434)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:965)
    at io.netty.channel.epoll.AbstractEpollStreamChannel$EpollStreamUnsafe.epollInReady(AbstractEpollStreamChannel.java:808)
    at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:474)
    at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:370)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.lang.Thread.run(Thread.java:748)