Screen Link: https://app.dataquest.io/m/127/project%3A-spark-installation-and-jupyter-notebook-integration/4/pyspark-shell
I’m working through the PySpark installation. I’ve installed Java and Spark on my computer and unzipped the tar file.
Despite searching extensively on Stack Overflow and reviewing the Apache Spark documentation, I cannot get PySpark to launch. I’ve included a screenshot below of the output from running the bin\pyspark command.
I’m really stuck on what I need to do to rectify this. I’ve appended the Spark location to my PATH environment variable and set SPARK_HOME to the same location. Any suggestions or guidance would be really appreciated!
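For reference, this is roughly how I set the environment variables in a Command Prompt session (the install path and Spark version below are placeholders, not my exact setup):

```
:: Example only — substitute the actual folder you unzipped Spark into
set SPARK_HOME=C:\spark\spark-3.5.0-bin-hadoop3
set PATH=%PATH%;%SPARK_HOME%\bin

:: Launch the PySpark shell from the Spark directory
cd %SPARK_HOME%
bin\pyspark
```

Note that `set` only affects the current session; I also added the equivalent entries permanently through the System Properties → Environment Variables dialog.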
Thanks,
George