Error with findspark

ModuleNotFoundError: No module named 'findspark'
How do I solve this error in a Jupyter notebook?

You may not have installed the module --> try pip install findspark.
If you did, the module may have been installed in the wrong folder (packages normally go into the site-packages folder if you use Anaconda on Windows) --> try moving it into the same folder as other libraries that import normally for you.

More rarely it's a problem with the module author: they wrote their relative imports wrongly during packaging, e.g. from folder import xxx rather than from .folder import xxx.
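A quick way to check which Python environment the notebook is actually running (pip may have installed findspark into a different one) is a sketch like this:

```python
import sys

# The interpreter the notebook kernel runs on; pip must install
# into this same environment for the import to succeed.
print(sys.executable)

# The folders Python searches for modules (site-packages among them).
print(sys.path)
```

If the interpreter shown here is not the one pip installed into, running %pip install findspark inside the notebook installs into the kernel's own environment.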

You can follow these steps:

  1. Download Spark.
  2. Unzip it to the desired location.
  3. Open Jupyter Notebook in the usual way, nothing special.
  4. Now run the code below:

import findspark
findspark.init("location of spark folder")

# in my case it is:

import findspark
findspark.init("/Users/josuaantonius/Downloads/spark-3.0.1-bin-hadoop2.7")

It works for me.
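For context, findspark.init() essentially points SPARK_HOME at the unzipped folder and adds Spark's bundled Python bindings to sys.path. A rough sketch of that idea, using a hypothetical install path (adjust it to wherever you unzipped Spark):

```python
import glob
import os
import sys

# Hypothetical location of the unzipped Spark download.
spark_home = "/opt/spark-3.0.1-bin-hadoop2.7"

# Point SPARK_HOME at the folder so Spark's scripts can find it.
os.environ["SPARK_HOME"] = spark_home

# Put the bundled PySpark sources on the import path.
sys.path.insert(0, os.path.join(spark_home, "python"))

# PySpark talks to the JVM through py4j, which ships as a zip
# inside the Spark download; add it to the path as well.
for py4j_zip in glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*.zip")):
    sys.path.insert(0, py4j_zip)
```

After this, import pyspark should resolve even though PySpark was never pip-installed, which is exactly the situation findspark is meant to handle.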