I have a CLI application for transforming JSONs. Most of its code consists of mapping, flatMapping, and traversing over Lists of JValues. Now I want to port this application to Spark, but it seems I would have to rewrite every function 1:1, just with RDD[JValue] in place of List[JValue].
Is there any way (a type class, perhaps) for a function to accept both Lists and RDDs?
If you want to share the processing code between the local and Spark versions, you can move the lambdas/anonymous functions that you pass to map/flatMap into named functions and reuse them.
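For example, a minimal sketch of that (`renameIdField` is a made-up stand-in for your real transformation logic):

```scala
import org.json4s._
import org.apache.spark.rdd.RDD

// Hypothetical transformation, pulled out of the lambda into a named
// function so both code paths can call it.
object Transforms {
  def renameIdField(json: JValue): JValue =
    json.transformField { case ("id", v) => ("identifier", v) }
}

// Local version:
def transformLocal(docs: List[JValue]): List[JValue] =
  docs.map(Transforms.renameIdField)

// Spark version: the wiring is still duplicated, but the logic is not.
def transformSpark(docs: RDD[JValue]): RDD[JValue] =
  docs.map(Transforms.renameIdField)
```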
If you also want to reuse the logic for how the maps/flatMaps/etc. are ordered, you could create implicit conversions from both RDD and Seq to a custom trait that exposes only the shared functions. Implicit conversions can become quite confusing, though, and I don't really think this is a good idea (but you could do it if you disagree with me :)).
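If you'd rather avoid implicit conversions, the same idea can be expressed as the type class you mentioned in the question. A minimal sketch, under the assumption that your pipeline only needs map/flatMap (`Mappable` and `dropNulls` are made-up names, not a library API):

```scala
import org.apache.spark.rdd.RDD
import org.json4s._
import scala.reflect.ClassTag

// Made-up type class: the minimal set of container operations the
// pipeline actually uses.
trait Mappable[F[_]] {
  def map[A, B: ClassTag](fa: F[A])(f: A => B): F[B]
  def flatMap[A, B: ClassTag](fa: F[A])(f: A => TraversableOnce[B]): F[B]
}

object Mappable {
  // Instance for plain Lists; the ClassTag is unused here.
  implicit val listInstance: Mappable[List] = new Mappable[List] {
    def map[A, B: ClassTag](fa: List[A])(f: A => B): List[B] = fa.map(f)
    def flatMap[A, B: ClassTag](fa: List[A])(f: A => TraversableOnce[B]): List[B] =
      fa.flatMap(f)
  }

  // Instance for RDDs; Spark's map/flatMap require the ClassTag.
  implicit val rddInstance: Mappable[RDD] = new Mappable[RDD] {
    def map[A, B: ClassTag](fa: RDD[A])(f: A => B): RDD[B] = fa.map(f)
    def flatMap[A, B: ClassTag](fa: RDD[A])(f: A => TraversableOnce[B]): RDD[B] =
      fa.flatMap(f)
  }
}

// The pipeline is then written once against the abstraction and works
// for both List[JValue] and RDD[JValue]:
def dropNulls[F[_]](docs: F[JValue])(implicit M: Mappable[F]): F[JValue] =
  M.map(docs)(_.remove(_ == JNull))
```

The trade-off versus implicit conversions is that nothing happens silently: each generic function states its requirement (`implicit M: Mappable[F]`) in its signature, so it's clear at the call site which containers are supported.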