Thursday, March 5, 2015

MapReduce to Spark



I have a MapReduce job written in Java that depends on multiple classes, and I want to run the job on Spark.


What steps should I follow to migrate it?


Do I need to make changes only to the MapReduce class?


Thanks!
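
For illustration, here is a minimal sketch of how a classic MapReduce word count might look when rewritten against Spark's Java RDD API (assuming Spark 1.x; the class name, app name, and HDFS paths are placeholders, and word count stands in for whatever the actual job does). The Mapper logic typically becomes flatMap/mapToPair calls, and the Reducer logic becomes reduceByKey:

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    import scala.Tuple2;

    public class WordCountOnSpark {
        public static void main(String[] args) {
            // Placeholder app name -- set master/config via spark-submit.
            SparkConf conf = new SparkConf().setAppName("WordCountOnSpark");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // What the Mapper did: read lines, emit (word, 1) pairs.
            JavaRDD<String> lines = sc.textFile("hdfs:///path/to/input");
            JavaPairRDD<String, Integer> pairs = lines
                .flatMap(line -> Arrays.asList(line.split("\\s+")))
                .mapToPair(word -> new Tuple2<>(word, 1));

            // What the Reducer did: sum the counts for each key.
            JavaPairRDD<String, Integer> counts = pairs.reduceByKey((a, b) -> a + b);

            counts.saveAsTextFile("hdfs:///path/to/output");
            sc.stop();
        }
    }

The helper classes the job depends on can usually be reused unchanged: bundle them into the application jar (or pass them with spark-submit's --jars option) so they reach the executors, keeping in mind that any object captured inside a Spark closure must be serializable.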



