How to concatenate a string and a column in a DataFrame in Spark?

I have today's date as a string. I need to concatenate it with a time value that is present as a column in a DataFrame.

When I try this, I get a StringIndexOutOfBoundsException.

My code:

val todaydate = LocalDate.now().toString()
println(todaydate)  // o/p: 2016-12-10

val todayrec_cutoff = todaydate + (" ") + df.col("colname")

Expected Output:

2016-12-10 05:00 
2016-12-10 22:30
Dasarathy D R asked Oct 16 '25
1 Answer

**Please refer to the Scala code below for string concatenation in prefix and postfix form.**


import org.apache.spark.sql.functions._
val empDF = MongoSpark.load(spark, readConfig) // dataframe empDF is loaded from MongoDB using MongoSpark

val prefixVal = "PrefixArkay " // plain string variables
val postfixVal = " PostfixArkay"

//Prefix: lit() lifts the string into a Column so concat can combine the two
val finalPreDF = empDF.withColumn("EMP", concat(lit(prefixVal), empDF.col("EMP")))
finalPreDF.show()

//Output will be as below
+-------------------+
|                EMP|
+-------------------+
|PrefixArkay DineshS|
+-------------------+

//Postfix
val finalPostDF = empDF.withColumn("EMP", concat(empDF.col("EMP"), lit(postfixVal)))
finalPostDF.show()

//Output will be as below
+--------------------+
|                 EMP|
+--------------------+
|DineshS PostfixArkay|
+--------------------+
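Applying the same idea to the question itself: the error comes from mixing a plain Scala string with a Column using `+`; instead, wrap the date string in `lit()` and pass both to `concat`. A minimal spark-shell-style sketch, where `df` and `colname` are stand-ins for the asker's dataframe and time column:

```scala
import java.time.LocalDate
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, concat, lit}

val spark = SparkSession.builder().master("local[*]").appName("concat-example").getOrCreate()
import spark.implicits._

val todaydate = LocalDate.now().toString // e.g. "2016-12-10"

// stand-in for the question's dataframe; "colname" holds the time strings
val df = Seq("05:00", "22:30").toDF("colname")

// lit() turns the plain string into a Column so concat can combine the two
val withCutoff = df.withColumn("todayrec_cutoff", concat(lit(todaydate + " "), col("colname")))
withCutoff.show(false)
```

This produces rows like `2016-12-10 05:00` in the new `todayrec_cutoff` column, matching the expected output in the question.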
Dinesh Shinkar answered Oct 18 '25

