I installed pyspark 3.2.0 via pip install pyspark in a conda environment named pyspark. I cannot find spark-defaults.conf. I am searching for it in ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark, since that is my understanding of what SPARK_HOME should be. Is ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark the correct SPARK_HOME? The SPARK_HOME environment variable is configured correctly.
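
For reference, one way to check which directory a pip-installed pyspark treats as SPARK_HOME is to ask pyspark itself (a sketch; _find_spark_home is an internal helper bundled with the package):

```python
# Print the directory pyspark resolves as SPARK_HOME.
# For a plain pip install this is typically .../site-packages/pyspark.
from pyspark.find_spark_home import _find_spark_home

print(_find_spark_home())
```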
With a pip installation, the $SPARK_HOME/conf directory needs to be created manually; then copy the configuration file templates into that directory and modify each configuration file as needed.
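
A minimal sketch of that, assuming SPARK_HOME resolves to the pip-installed package directory; the file name is the standard Spark one, but the settings shown are only placeholders:

```python
import os
from pyspark.find_spark_home import _find_spark_home  # internal helper bundled with pyspark

# Create $SPARK_HOME/conf, which a pip install does not create for you.
conf_dir = os.path.join(_find_spark_home(), "conf")
os.makedirs(conf_dir, exist_ok=True)

# Write a basic spark-defaults.conf; adjust the settings to your needs.
with open(os.path.join(conf_dir, "spark-defaults.conf"), "w") as f:
    f.write("spark.driver.memory            2g\n")
    f.write("spark.sql.shuffle.partitions   64\n")
```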