 

How do I set spark.sql.debug.maxToStringFields?

I have looked through similar questions that have been asked before. No luck so far. I'm using PySpark within a venv environment. How do I go about changing the setting? Do I do it from within jupyter notebook/python script? Or do I need to use bash command? Is it in a specific configuration file? If so, where is it located?

asked Dec 17 '25 by forever_learner

2 Answers

You can set it in Spark's configuration file (conf/spark-defaults.conf) or programmatically at runtime:

spark.conf.set("spark.sql.debug.maxToStringFields", <value>)
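If you launch jobs with spark-submit rather than an interactive session, the same property can be passed on the command line with the --conf flag (the script name and the value 1000 here are just illustrative):

```shell
spark-submit --conf spark.sql.debug.maxToStringFields=1000 my_job.py
```

This applies the setting for that one submission without touching any config files.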
answered Dec 20 '25 by Norwegian Salmon


This config, along with many others, has been moved into SQLConf: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala

It can be set either in the config file or at runtime from your Spark session, using:

spark.conf.set("spark.sql.debug.maxToStringFields", 1000)
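To make this the default for every application instead of setting it per session, a line can be added to conf/spark-defaults.conf under your Spark installation (the value 1000 mirrors the example above):

```
spark.sql.debug.maxToStringFields  1000
```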

answered Dec 20 '25 by ozeyboy


