
Spark/Scala parallel write to Redis

Is it possible to write to Redis in parallel from Spark?

(Or: how to write tens of thousands of keys/lists quickly from Spark)

Currently I'm writing to Redis key by key, in sequence, and it's taking forever. I need to write about 90000 lists (of length 2-2000), and speed is extremely important; right now the job takes on the order of an hour. Traditional Redis benchmarks claim thousands of writes per second, but my pipeline isn't anywhere near that.

Any help is appreciated.

asked Dec 21 '25 by BBischof


1 Answer

A single Redis instance runs in one thread, so operations on it are inherently sequential. If you have a Redis cluster, then the instance a datum is written to depends on a hash slot computed from the key being written. This hash function (among other things) ensures that the load gets distributed across all the Redis instances in the cluster. If your cluster has N instances, you can execute at most (roughly) N parallel writes, because each cluster instance is still single-threaded. A reasonable Spark Redis connector should exploit the cluster efficiently; see the sketch below for how to drive parallel writes from Spark partitions yourself.
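
For example, one way to get parallel writes out of Spark is to open a connection per partition inside foreachPartition, so each executor writes its own slice independently instead of funneling every key through the driver. A minimal sketch, assuming an RDD of (key, values) pairs and the Jedis client; writeListsToRedis, redisHost, and redisPort are hypothetical names, and against a real cluster you would substitute a cluster-aware client or a connector such as spark-redis:

    import org.apache.spark.rdd.RDD
    import redis.clients.jedis.Jedis

    // Sketch: write each partition's lists in parallel with the others.
    def writeListsToRedis(data: RDD[(String, Seq[String])],
                          redisHost: String,
                          redisPort: Int): Unit = {
      data.foreachPartition { partition =>
        // One connection per partition; Jedis instances are not
        // thread-safe and must not be shared across executors.
        val jedis = new Jedis(redisHost, redisPort)
        try {
          partition.foreach { case (key, values) =>
            // RPUSH appends all values to the list in one command;
            // skip empty lists, since RPUSH requires at least one value.
            if (values.nonEmpty) jedis.rpush(key, values: _*)
          }
        } finally {
          jedis.close()
        }
      }
    }

With this shape, the degree of parallelism is the number of RDD partitions (bounded, on a cluster, by the N instances noted above), so repartitioning the RDD is the knob to turn.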

Either way, Redis is really quick, especially if you use mass inserts (pipelining), which batch many commands into a single network round trip.
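
A hedged pipelined variant of the sketch above, since the network round trip per key is usually the bottleneck when writing tens of thousands of small lists; writeListsPipelined and the batch size of 1000 are assumptions to tune, not values from the answer:

    import org.apache.spark.rdd.RDD
    import redis.clients.jedis.Jedis

    // Sketch: same per-partition layout, but commands are queued in a
    // client-side pipeline so many writes share one round trip.
    def writeListsPipelined(data: RDD[(String, Seq[String])],
                            redisHost: String,
                            redisPort: Int,
                            batchSize: Int = 1000): Unit = {
      data.foreachPartition { partition =>
        val jedis = new Jedis(redisHost, redisPort)
        val pipeline = jedis.pipelined()
        try {
          partition.zipWithIndex.foreach { case ((key, values), i) =>
            if (values.nonEmpty) pipeline.rpush(key, values: _*)
            // Flush every batchSize commands so queued responses
            // don't accumulate unboundedly; 1000 is an arbitrary default.
            if ((i + 1) % batchSize == 0) pipeline.sync()
          }
          pipeline.sync() // flush whatever remains in the last batch
        } finally {
          jedis.close()
        }
      }
    }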

answered Dec 23 '25 by David Weber


