To set up an Apache Spark cluster, we need to know two things: how to set up a master node, and how to set up worker nodes. Following is a step-by-step guide to set up the master node for an Apache Spark cluster. Execute the following steps on the node which you want to be the master.

1. Navigate to the Spark configuration directory. SPARK_HOME is the complete path to the root directory of Apache Spark on your computer.

2. Edit the file spark-env.sh and set SPARK_MASTER_HOST. Note: if spark-env.sh is not present, spark-env.sh.template would be present. Make a copy of spark-env.sh.template with the name spark-env.sh and add/edit the field SPARK_MASTER_HOST. Replace the IP with the IP address assigned to your computer (the one which you would like to make the master). Part of the file with the SPARK_MASTER_HOST addition is shown below:

spark-env.sh

# Options for the daemons used in the standalone deploy mode
# - SPARK_MASTER_HOST, to bind the master to a different IP address or hostname
# - SPARK_MASTER_PORT / SPARK_MASTER_WEBUI_PORT, to use non-default ports for the master
SPARK_MASTER_HOST=192.168.0.102

3. Go to SPARK_HOME/sbin and execute the following command:

./start-master.sh

You would see the following in the log file, specifying the IP address of the master node, the port on which Spark has been started, the port on which the web UI has been started, etc.

Starting org.apache.spark.deploy.master.Master, logging to /usr/lib/spark/logs/.
Spark Command: /usr/lib/jvm/default-java/jre/bin/java -cp /usr/lib/spark/conf/:/usr/lib/spark/jars/* -Xmx1g org.apache.spark.deploy.master.Master --host 192.168.0.102 --port 7077 --webui-port 8080
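The copy-and-edit part of the setup can be sketched as a short shell sequence. This is a sketch, not a definitive recipe: it assumes Spark is installed at /usr/lib/spark (the path that appears in the log output) and uses the example address 192.168.0.102 — substitute your own install path and IP.

```shell
# Sketch of the configuration steps: create spark-env.sh from the shipped
# template and bind the master to this machine's address.
# SPARK_HOME default and the 192.168.0.102 address are assumptions; adjust them.
SPARK_HOME=${SPARK_HOME:-/usr/lib/spark}
cd "$SPARK_HOME/conf"
cp spark-env.sh.template spark-env.sh            # spark-env.sh does not exist by default
echo 'SPARK_MASTER_HOST=192.168.0.102' >> spark-env.sh
```

After this, SPARK_MASTER_HOST is picked up by the daemon scripts in SPARK_HOME/sbin the next time they start.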
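Once the master is running, workers and spark-submit reach it through a spark:// URL built from SPARK_MASTER_HOST and the master port (7077 by default, as seen in the Spark Command above). A small sketch of deriving that URL from the file edited earlier — the /usr/lib/spark path is the same assumption as before:

```shell
# Sketch: read SPARK_MASTER_HOST back out of spark-env.sh and print the
# spark:// URL that workers and spark-submit --master would use.
# 7077 is Spark's default master port, matching the log output above.
SPARK_HOME=${SPARK_HOME:-/usr/lib/spark}
MASTER_HOST=$(sed -n 's/^SPARK_MASTER_HOST=//p' "$SPARK_HOME/conf/spark-env.sh")
echo "spark://${MASTER_HOST}:7077"
```

The same URL also appears at the top of the master's web UI on port 8080, which is a quick way to confirm the master bound to the address you configured.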