Spark Installation Issue

I’m working through the PySpark installation. I have installed Java and Spark on my computer and unzipped the tar file.

Despite searching extensively on Stack Overflow and reviewing the Apache Spark documentation, I cannot get PySpark to launch. I have included a screenshot below of the output from running the `bin\pyspark` command.

I’m really stuck on what I need to do to rectify this. I’ve appended the Spark location to the PATH environment variable and set SPARK_HOME to that location as well. Any suggestions or guidance would be really appreciated!
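For reference, a common way to set these variables on Windows looks like the sketch below. The install path is hypothetical; substitute the folder you actually unzipped Spark into, and note that `setx` only affects prompts opened *after* it runs.

```shell
:: Hypothetical paths - replace with your actual Java and Spark locations
setx JAVA_HOME "C:\Program Files\Java\jdk-11"
setx SPARK_HOME "C:\spark\spark-3.4.1-bin-hadoop3"
setx PATH "%PATH%;%SPARK_HOME%\bin"

:: Open a NEW command prompt so the variables take effect, then launch:
pyspark
```

If `pyspark` still fails from a fresh prompt, it is worth echoing `%SPARK_HOME%` and `%JAVA_HOME%` to confirm they point at real directories before digging further.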



@boormang: I have not ventured into Spark or installing it myself, so I’m bumping this topic in case any of the other volunteers have experience with it but missed it.


Hi boormang,

Please refer to this article. I personally found it quite helpful.




Not sure if anyone here still needs help with this, but I personally found this link helpful. Run the commands it lists in the Anaconda command prompt.

Hope it helps!
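For anyone who prefers to skip the manual tar-file setup entirely, an alternative sketch is to install PySpark as a package from the Anaconda Prompt (this assumes a working conda or pip install; it is not the method from the linked article):

```shell
# Install PySpark from conda-forge in the Anaconda Prompt
conda install -c conda-forge pyspark

# Or, equivalently, with pip:
pip install pyspark
```

This route manages SPARK_HOME for you within the environment, which sidesteps most PATH-related launch problems.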
