Bringing together the Apache Cassandra experts from the community and DataStax.


ddbRocks_150730 asked:

How do I connect from the host machine to the Spark web UI of a DSE instance running in a Docker container?

I have very limited experience with Docker, so please bear with me on this basic question.

I started the dse-server container using the method described at https://hub.docker.com/r/datastax/dse-server:

docker run -e DS_LICENSE=accept --name my-dse -d datastax/dse-server:<version tag> -k

I set up the Studio container, and everything works fine from the Studio URL. I then launched a Spark prompt from PowerShell with "docker exec -it my-dse dse spark", and that works too: I can see the Spark shell on my host machine.

Now I want to:

1) open the Spark web UI

2) create a demo application in Scala that communicates with the Spark instance running in Docker



Tags: spark, docker

1 Answer

bettina.swynnerton answered:

Hi,

I wouldn't say that this is a basic question at all.

I have experimented with publishing ports from the container to the host, and I managed to access the Spark Master UI at 127.0.0.1:7080 after starting the container like this:

docker run -e DS_LICENSE=accept -p 7080:7080 -p 7081:7081 --name my-dse -d datastax/dse-server -k 

As you can see, I have also published the ports for the worker UI.

However, the experience is not entirely smooth:

When attempting to click through to the worker information for the Spark shell, for example, the UI redirects to <container-id>:7081.

You will have to replace this with 127.0.0.1:7081 to see the worker info. At least this way you can get to the logs.

If you want to get to the Spark context web UI for the Spark shell, you will also need to publish port 4040, and again you would need to access it via 127.0.0.1:4040.
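Putting the steps above together, a single run command that publishes the Master UI (7080), the worker UI (7081), and the Spark context UI (4040) up front might look like this. This is a sketch based on the commands earlier in this thread; adjust the image tag and names to your setup:

```shell
# Publish the Spark Master UI (7080), worker UI (7081), and the
# Spark context / driver UI (4040) when starting the DSE container.
# -k enables DSE Analytics (Spark) mode, as in the original command.
docker run -e DS_LICENSE=accept \
  -p 7080:7080 -p 7081:7081 -p 4040:4040 \
  --name my-dse -d datastax/dse-server -k
```

All three UIs should then be reachable from the host at 127.0.0.1 on the respective ports, with the same caveat about rewriting `<container-id>` redirects to 127.0.0.1.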

As I said, not smooth, but I hope this gets you on the right track.

Regarding your second question:

How do you want to submit your Spark job? Perhaps you can update your initial post with more information about the demo app and how you want to deploy it.

2 comments

Thank you, bettina.swynnerton. This is what I needed to accomplish. :) A big thank you for your help.

Regarding the sample application: as I am exploring Docker containers for learning purposes, I don't have any code as of now. I will come up with some sample data and code, and I will share it here.

My problem is resolved, so I am accepting this as the answer. Thank you!



Perfect, I am delighted to hear that this works for you. I might try an example app too, in case there are any problems with submitting the job in this setup.

Cheers!

P.S. I converted your answer to a comment
