
Spark SQL is throwing a PermGen space error

I have configured a 3-node Spark cluster and started the Spark Thrift Server using the start-thriftserver.sh script with some custom properties. I also added the property spark.executor.extraJavaOptions -XX:MaxPermSize=1024m -XX:PermSize=256m to the spark-defaults.conf file on each node of the cluster.
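For reference, the setup described above would look roughly like this; the flag values are the ones from the question, but the master URL is an assumption:

```
# conf/spark-defaults.conf (on every node of the cluster)
spark.executor.extraJavaOptions  -XX:MaxPermSize=1024m -XX:PermSize=256m

# Start the Thrift server (master URL is illustrative)
sbin/start-thriftserver.sh --master spark://master-host:7077
```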

Using the Hive JDBC driver, I am able to connect to Spark SQL and have tried some queries on it.
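One way to make this kind of connection is Beeline, which ships with Spark and speaks the same Hive JDBC protocol; the host name, user, and table below are hypothetical, and 10000 is the Thrift server's default port:

```
# Connect to the Spark Thrift Server over JDBC
beeline -u jdbc:hive2://thrift-host:10000 -n myuser

# Then run queries at the beeline prompt, e.g.:
#   SELECT count(*) FROM some_table;
```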

But after some time it throws a PermGen space error, and even after restarting the Thrift server many times it keeps throwing the same error.


asked Mar 25 '26 by Kaushal


1 Answer

Finally I found the solution.

I went through the application log and saw that the PermGen error occurs in the Spark driver, not the executors. So instead of spark.executor.extraJavaOptions, I added the -XX:MaxPermSize=1024m -XX:PermSize=256m flags to spark.driver.extraJavaOptions.
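A minimal sketch of the fix, assuming the flags move verbatim from the executor setting to the driver setting in spark-defaults.conf:

```
# conf/spark-defaults.conf -- the PermGen error came from the driver JVM,
# so the flags belong on spark.driver.extraJavaOptions instead
spark.driver.extraJavaOptions  -XX:MaxPermSize=1024m -XX:PermSize=256m
```

Note that these flags only apply on Java 7 and earlier: Java 8 removed PermGen entirely and replaced it with Metaspace, where the equivalent knob is -XX:MaxMetaspaceSize.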

answered Mar 26 '26 by Kaushal