(Screenshot: the web UI on port 8081.)
The master shows a running application when I start a Scala shell or a PySpark shell. But when I use spark-submit to run a Python script, the master doesn't show any running application. This is the command I used:
spark-submit --master spark://localhost:7077 sample_map.py
The web UI is at :4040. I want to know whether I'm submitting scripts the right way, or whether spark-submit never really shows a running application.
localhost:8080 or <master_ip>:8080 doesn't open for me, but <master_ip>:8081 opens and shows the executor info.
These are my configurations in spark-env.sh:
export SPARK_EXECUTOR_MEMORY=512m
export SPARK_MASTER_WEBUI_PORT=4040
export SPARK_WORKER_CORES=2
export SPARK_WORKER_MEMORY=1g
export SPARK_WORKER_INSTANCES=2
export SPARK_WORKER_DIR=/opt/worker
export SPARK_DAEMON_MEMORY=512m
export SPARK_LOCAL_DIRS=/tmp/spark
export SPARK_MASTER_IP='splunk_dep'
I'm using CentOS, Python 2.7, and spark-2.0.2-bin-hadoop2.7.
You can open the Spark master's web UI, which is http://localhost:8080 by default, to see running apps (in standalone cluster mode).
If multiple apps are running, their UIs bind to successive ports: 4040, 4041, 4042, and so on.
You can access this interface by simply opening http://<driver-node>:4040 in a web browser. If multiple SparkContexts are running on the same host, they will bind to successive ports beginning with 4040 (4041, 4042, etc.).
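If you are not sure which port a particular application UI ended up on, one way to check (a minimal sketch, assuming Spark 2.x, where SparkContext exposes uiWebUrl; the app name is just an illustration) is to print it from the driver:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("ui-port-check")
val sc = new SparkContext(conf)

// uiWebUrl returns e.g. Some("http://<driver-host>:4040"); if 4040 is already
// taken by another running app, this will be 4041, 4042, and so on
println(sc.uiWebUrl.getOrElse("Spark UI is disabled"))

sc.stop()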
For a local run, use this:
import org.apache.spark.{SparkConf, SparkContext}
// local mode runs in-process, without going through the standalone master
val sparkConf = new SparkConf().setAppName("Your app Name").setMaster("local")
val sc = new SparkContext(sparkConf)
And when you submit with spark-submit, use this (without setMaster):
// the master URL is supplied on the spark-submit command line instead
val sparkConf = new SparkConf().setAppName("Your app Name")
val sc = new SparkContext(sparkConf)
This won't work in a local test, but when you compile it this way and submit the job with spark-submit, the application will show up in the UI.
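For completeness, here is a minimal sketch of what such a standalone app could look like end to end (the object name, the job, and the jar name below are assumptions for illustration, not the asker's actual script):

import org.apache.spark.{SparkConf, SparkContext}

object SampleApp {
  def main(args: Array[String]): Unit = {
    // no setMaster here: the master URL comes from the spark-submit command line
    val conf = new SparkConf().setAppName("Sample App")
    val sc = new SparkContext(conf)

    // a trivial job, just so the application registers with the master and does some work
    val doubled = sc.parallelize(1 to 100).map(_ * 2)
    println(s"count = ${doubled.count()}")

    sc.stop()
  }
}

Packaged into a jar and submitted with something like spark-submit --master spark://localhost:7077 --class SampleApp sample-app.jar, it registers with that master and should then show up under "Running Applications" on the master web UI while it runs.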
Hope this explains it.