Bringing together the Apache Cassandra experts from the community and DataStax.

Sugumar.Vijay asked:

Not able to capture driver and executor logs in console

I'm running the Spark job with the command below, but I'm not able to capture the driver and executor logs in the console:

# Run on a Spark standalone cluster in cluster deploy mode with supervise
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://207.184.161.138:7077 \
  --deploy-mode cluster \
  --supervise \
  --executor-memory 20G \
  --total-executor-cores 100 \
  /path/to/examples.jar \
  1000

1 Answer

Erick Ramirez answered:

The log output of each job goes to stdout and stderr log files, which by default are stored under the $SPARK_HOME/work directory on the worker nodes, not to an actual console like regular Unix commands/scripts/apps.
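As a rough sketch of where to look (the exact application/driver directory names are assumptions based on a typical standalone deployment; adjust SPARK_HOME to your install), you can inspect those files on a worker node like this:

```shell
# Default work directory on a standalone worker; override SPARK_HOME as needed.
WORK_DIR="${SPARK_HOME:-/opt/spark}/work"

# One subdirectory per application (and per driver in cluster deploy mode).
ls "$WORK_DIR" 2>/dev/null

# Executor logs: one numbered subdirectory per executor.
cat "$WORK_DIR"/app-*/0/stdout 2>/dev/null
cat "$WORK_DIR"/app-*/0/stderr 2>/dev/null

# Driver logs, when the job was submitted with --deploy-mode cluster.
cat "$WORK_DIR"/driver-*/stdout 2>/dev/null
cat "$WORK_DIR"/driver-*/stderr 2>/dev/null
```

Note that with `--deploy-mode cluster` the driver itself runs on a worker, so its logs land on whichever worker hosted it, not on the machine where you ran `spark-submit`.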

But since your question is very specific to Apache Spark, I suggest you ask it in the Spark community, since that's the right place for it. Cheers!
