 

ipython is not recognized as an internal or external command (pyspark)

I have installed the Spark release spark-2.2.0-bin-hadoop2.7.

I'm using Windows 10.

My Java version is 1.8.0_144.

I have set my environment variables:

SPARK_HOME D:\spark-2.2.0-bin-hadoop2.7

HADOOP_HOME D:\Hadoop (where I placed bin\winutils.exe)

PYSPARK_DRIVER_PYTHON ipython

PYSPARK_DRIVER_PYTHON_OPTS notebook

and I added D:\spark-2.2.0-bin-hadoop2.7\bin to Path.
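For reference, this is a sketch of how the variables above can be set from a Command Prompt on Windows (`setx` persists them for newly opened shells; the paths are the ones from my setup, so adjust them to your install locations):

```shell
:: Windows cmd sketch -- persist the variables described above.
:: setx writes to the user environment; open a NEW Command Prompt afterwards.
setx SPARK_HOME "D:\spark-2.2.0-bin-hadoop2.7"
setx HADOOP_HOME "D:\Hadoop"
setx PYSPARK_DRIVER_PYTHON "ipython"
setx PYSPARK_DRIVER_PYTHON_OPTS "notebook"

:: Append the Spark bin directory so the pyspark launcher is found
setx PATH "%PATH%;D:\spark-2.2.0-bin-hadoop2.7\bin"
```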

When I launch pyspark from the command line, I get this error:

ipython is not recognized as an internal or external command

I also tried setting PYSPARK_DRIVER_PYTHON to jupyter, but it gives me the same error (not recognized as an internal or external command).

Any help please?

Eliane PDC asked Oct 27 '25 05:10

1 Answer

Search your machine for the ipython application; in my case it is in "c:\Anaconda3\Scripts". Then add that path to the PATH environment variable.
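A sketch of doing this from a Command Prompt, assuming an Anaconda install under c:\Anaconda3 (the directory `where` reports on your machine is the one to use):

```shell
:: Locate ipython.exe -- the directory varies by Python distribution
where ipython

:: Suppose it reports c:\Anaconda3\Scripts\ipython.exe: append that
:: directory to the user PATH (takes effect in newly opened shells)
setx PATH "%PATH%;c:\Anaconda3\Scripts"
```

After opening a fresh Command Prompt, `pyspark` should be able to find the `ipython` driver.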

AlexDR answered Oct 29 '25 07:10


