 

Where to modify spark-defaults.conf if I installed pyspark via pip install pyspark

I installed pyspark 3.2.0 via pip install pyspark, in a conda environment named pyspark. I cannot find spark-defaults.conf. I have been searching for it under ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark, since that is my understanding of what SPARK_HOME should be.

  1. Where can I find spark-defaults.conf? I want to modify it.
  2. Am I right in setting SPARK_HOME to the installation location of pyspark, ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark?
Nitin Siwach asked Aug 30 '25 16:08

1 Answer

2. Yes, your SPARK_HOME environment variable is configured correctly: for a pip installation it should point to the pyspark package directory.
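
As a quick sanity check (not part of the original answer), you can print the directory pip installed pyspark into, which is the path SPARK_HOME should point to:

    import os
    import pyspark

    # The directory pip installed pyspark into; SPARK_HOME should point here.
    spark_home = os.path.dirname(pyspark.__file__)
    print(spark_home)
    # e.g. ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark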

1. In a pip installation, the $SPARK_HOME/conf directory does not exist by default. You need to create it manually, then copy the configuration file templates into it and modify each configuration file there.
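
A minimal sketch of that step in Python (assuming the path check above; the two properties written below are only placeholders, not settings from the answer):

    import os
    import pyspark

    spark_home = os.path.dirname(pyspark.__file__)
    conf_dir = os.path.join(spark_home, "conf")

    # A pip install of pyspark does not ship this directory, so create it.
    os.makedirs(conf_dir, exist_ok=True)

    # Write (or copy a template of) spark-defaults.conf here; the properties
    # below are placeholder examples only.
    with open(os.path.join(conf_dir, "spark-defaults.conf"), "w") as f:
        f.write("spark.driver.memory 4g\n")
        f.write("spark.sql.shuffle.partitions 64\n")

Spark should pick up $SPARK_HOME/conf/spark-defaults.conf (or the directory pointed to by SPARK_CONF_DIR) when a session is launched, so settings placed there apply without changing your code.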

过过招 answered Sep 03 '25 10:09