 

Casting BigInt to Int in Spark

Hi, I'm trying to cast a BigInt to an Int in order to generate Rating instances. I only want to use values that are small enough to fit into an Int. I use the following code:

val tup = rs.select("kunden_nr", "product_list")
val rdd = tup.rdd.map(row => (row.getAs[BigInt](0), row.getAs[Seq[Int]](1)))
val fs = rdd.filter(el => el._1.isValidInt)
fs.count()
rdd.count()

The fs.count() call throws the following exception in Zeppelin:

java.lang.ClassCastException: java.lang.Long cannot be cast to scala.math.BigInt

jojo_Berlin asked Nov 30 '25


1 Answer

Casting is like changing "the glasses" your code uses to view what your value references; it neither changes the referenced content nor makes the reference point to a new BigInt instance.

That implies that you need to get the value with the type it really has (here a java.lang.Long) and then build a BigInt instance from it:

BigInt(row.getAs[Long](0))

Following the same reasoning, you can create an Int instance from the Long as follows:

row.getAs[Long](0).toInt

But beware: the Long value might overflow the representation range of the Int type.
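To illustrate the answer's point, here is a minimal sketch in plain Scala (no Spark session needed; `toIntOption` is a hypothetical helper, not part of the question's code) that guards the conversion with isValidInt before calling toInt, mirroring the filter the question already applies:

```scala
object SafeCast {
  // Hypothetical helper: convert a Long to an Int only when it fits,
  // using isValidInt to guard against silent overflow in toInt.
  def toIntOption(l: Long): Option[Int] =
    if (l.isValidInt) Some(l.toInt) else None

  def main(args: Array[String]): Unit = {
    println(toIntOption(42L))         // small enough to fit into an Int
    println(toIntOption(3000000000L)) // exceeds Int.MaxValue, so None
    println(BigInt(42L))              // building a BigInt from a Long
  }
}
```

In the question's pipeline, the same guard would apply after reading the column as Long: `rdd.filter(el => el._1.isValidInt)` with `row.getAs[Long](0)` instead of `row.getAs[BigInt](0)`.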

Pablo Francisco Pérez Hidalgo answered Dec 02 '25


