How to parse string to array in Spark?

How do I flatten an array of strings into multiple rows of a DataFrame in Spark 2.2.0?

Input Row ["foo", "bar"]

val inputDS = Seq("""["foo", "bar"]""").toDF

inputDS.printSchema()

root
 |-- value: string (nullable = true)

Input Dataset inputDS

inputDS.show(false)

value
-----
["foo", "bar"]

Expected output dataset outputDS

value
-----
"foo"
"bar"

I tried the explode function, as below, but it didn't quite work

inputDS.select(explode(from_json(col("value"), ArrayType(StringType))))

and I get the following error

org.apache.spark.sql.AnalysisException: cannot resolve 'jsontostructs(`value`)' due to data type mismatch: Input schema string must be a struct or an array of structs

I also tried the following

inputDS.select(explode(col("value")))

And I get the following error

org.apache.spark.sql.AnalysisException: cannot resolve 'explode(`value`)' due to data type mismatch: input to function explode should be array or map type, not StringType
asked Feb 02 '26 by user1870400

1 Answer

The exception is thrown by:

from_json(col("value"), ArrayType(StringType))

not by explode. In Spark 2.2 the schema passed to from_json has to be a struct or an array of structs, hence the message:

Input schema array<string> must be a struct or an array of structs.
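As an aside: if I remember correctly, Spark 2.4 lifted this restriction, so on 2.4 and later the original from_json attempt works as written. A minimal sketch, assuming a Spark 2.4+ session and the inputDS from the question:

import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

// Spark 2.4+ only: from_json accepts an array of strings as the root schema,
// parses the JSON, and explode turns the array into one row per element.
inputDS
  .select(explode(from_json(col("value"), ArrayType(StringType))).as("value"))
  .show(false)
// +-----+
// |value|
// +-----+
// |foo  |
// |bar  |
// +-----+

Because this actually parses the JSON, the elements come out without their surrounding quotes.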

On Spark 2.2 you can work around it by stripping the brackets and splitting the string yourself:

inputDS.selectExpr(
  """split(substring(value, 2, length(value) - 2), ',\\s+') as value""")

(the triple-quoted Scala string keeps the double backslash intact, since the SQL parser unescapes string literals as well) and then explode the output.
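Putting it together, here is a minimal end-to-end sketch of the same idea using the DataFrame API functions instead of selectExpr (regexp_replace to strip the brackets rather than substring, which sidesteps the SQL string-literal escaping); it assumes the inputDS from the question:

import org.apache.spark.sql.functions._

// Strip the leading "[" and trailing "]", split on the commas,
// then explode the resulting array<string> into one row per element.
val outputDS = inputDS
  .select(split(regexp_replace(col("value"), "^\\[|\\]$", ""), ",\\s*").as("value"))
  .select(explode(col("value")).as("value"))

outputDS.show(false)
// +-----+
// |value|
// +-----+
// |"foo"|
// |"bar"|
// +-----+

Note that the elements keep their surrounding JSON quotes, since the string is never parsed as JSON; this matches the expected output in the question.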

answered Feb 05 '26 by Alper t. Turker

