spark - where is spark.sql.legacy.timeParserPolicy documented?

The Migration Guide: SQL, Datasets and DataFrame refers to spark.sql.legacy.timeParserPolicy:

Parsing/formatting of timestamp/date strings. This affects CSV/JSON datasources and the unix_timestamp, date_format, to_unix_timestamp, from_unixtime, to_date, and to_timestamp functions when patterns specified by users are used for parsing and formatting. In Spark 3.0, we define our own pattern strings in Datetime Patterns for Formatting and Parsing, which is implemented via DateTimeFormatter under the hood. The new implementation performs strict checking of its input. For example, the 2015-07-22 10:00:00 timestamp cannot be parsed if the pattern is yyyy-MM-dd because the parser does not consume the whole input. Another example is that the 31/01/2015 00:00 input cannot be parsed by the dd/MM/yyyy hh:mm pattern because hh supposes hours in the range 1-12. In Spark version 2.4 and below, java.text.SimpleDateFormat is used for timestamp/date string conversions, and the supported patterns are described in SimpleDateFormat. The old behavior can be restored by setting spark.sql.legacy.timeParserPolicy to LEGACY.
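A quick way to see the difference described above (a minimal sketch, assuming a local Spark 3.x session; the app name and the ts column are just illustrative):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.to_date

val spark = SparkSession.builder().master("local[*]").appName("demo").getOrCreate()
import spark.implicits._

val df = Seq("2015-07-22 10:00:00").toDF("ts")

// Under the default policy (EXCEPTION), the new java.time parser rejects the
// input because "yyyy-MM-dd" does not consume the trailing time part, and
// Spark throws an upgrade exception since the legacy parser would succeed:
// df.select(to_date($"ts", "yyyy-MM-dd")).show()   // throws SparkUpgradeException

// Restoring the pre-3.0 behaviour lets SimpleDateFormat silently ignore
// the unconsumed "10:00:00" suffix:
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")
df.select(to_date($"ts", "yyyy-MM-dd")).show()      // 2015-07-22
```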

The Spark Application Properties page does not explain it. Where is it documented, and what values can be specified?

asked Oct 26 '25 by mon
1 Answer

For some reason it does not appear in the Spark configuration docs, but you can find it in SQLConf.scala:

When set to LEGACY, java.text.SimpleDateFormat is used for formatting and parsing dates/timestamps in a locale-sensitive manner, which is the approach before Spark 3.0. When set to CORRECTED, classes from the java.time.* packages are used for the same purpose. The default value is EXCEPTION; a RuntimeException is thrown when the two approaches would give different results.
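Since it is a runtime SQL config, all three values can be switched without restarting the session. A minimal sketch of what each value means for the failing parse shown in the question (the null-on-failure note assumes the usual non-ANSI behaviour of to_date/to_timestamp):

```scala
// EXCEPTION (default): fail fast when the legacy and the new parsers would disagree.
spark.conf.set("spark.sql.legacy.timeParserPolicy", "EXCEPTION")

// CORRECTED: always use the new java.time.* parser; a parse the new parser
// rejects returns null rather than falling back to the legacy result.
spark.conf.set("spark.sql.legacy.timeParserPolicy", "CORRECTED")

// LEGACY: always use java.text.SimpleDateFormat, as in Spark 2.4 and below.
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")
```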

answered Oct 29 '25 by blackbishop

