Apache Spark standalone cluster on Windows

A Spark standalone cluster consists of two kinds of processes:

  1. Worker
  2. Resource Manager (the master)

[Image: Spark cluster overview]
Before you begin, always start Command Prompt with administrator rights, i.e. with the Run as administrator option.
  1. Download Spark and set SPARK_HOME=<path_to_spark>. If you choose a Spark build pre-built for a particular version of Hadoop, there is no need to download Hadoop separately in step 2.
  2. Download Hadoop, set HADOOP_HOME=<path_to_hadoop>, and add %HADOOP_HOME%\bin to the PATH variable (see the example after this list).
  3. Download winutils.exe (built for the same Hadoop version as above) and place it under %HADOOP_HOME%\bin.
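As a minimal sketch, assuming Spark was extracted to C:\spark and Hadoop to C:\hadoop (placeholder paths and version numbers, not from the original article), the variables can be set for the current Command Prompt session like this:

:: Placeholder paths; adjust to where you actually extracted the archives
set SPARK_HOME=C:\spark\spark-3.3.0-bin-hadoop3
set HADOOP_HOME=C:\hadoop\hadoop-3.3.0
set PATH=%PATH%;%HADOOP_HOME%\bin

Note that set only affects the current session; to make the variables permanent, define them in the System Properties dialog instead.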
To start the master, run the following from %SPARK_HOME% on the master machine:

bin\spark-class org.apache.spark.deploy.master.Master --host <master_ip>
Then start a worker on each worker machine, pointing it at the master's URL (the standalone master listens on port 7077 by default):

bin\spark-class org.apache.spark.deploy.worker.Worker spark://<master_ip>:<port> --host <worker_ip>
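For illustration, if the master machine were at 192.168.1.5 and a worker at 192.168.1.10 (made-up addresses), the two commands would look like this:

:: On the master machine
bin\spark-class org.apache.spark.deploy.master.Master --host 192.168.1.5

:: On the worker machine; 7077 is the master's default port
bin\spark-class org.apache.spark.deploy.worker.Worker spark://192.168.1.5:7077 --host 192.168.1.10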
Once the master is running, open its web UI in a browser at http://<master_ip>:8080 and confirm that each worker appears there as ALIVE.

[Image: Spark UI]
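To verify the cluster end to end, you can submit the SparkPi example that ships with the Spark distribution against the master URL (addresses as in the illustration above; the exact examples jar name depends on your Spark and Scala versions):

:: Run from %SPARK_HOME%; jar name shown is an assumption for Spark 3.3.0 / Scala 2.12
bin\spark-submit --master spark://192.168.1.5:7077 --class org.apache.spark.examples.SparkPi examples\jars\spark-examples_2.12-3.3.0.jar 100

If the job succeeds, the driver output includes an approximation of pi and the application shows up as completed in the master UI.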
