Saturday, March 28, 2015

Apache Spark executor uses more than one core despite spark.executor.cores=1



When I start an Apache Spark application on CentOS 6.5, 'top' reports more than 100% CPU load for each executor, and the load average is significantly higher than the number of cores.


As a result, the garbage collector is under heavy load.
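For completeness, this is roughly how the per-executor load and GC activity can be inspected on the CentOS box (a sketch; `<pid>` is a placeholder for the executor JVM's actual process id):

```shell
# Find the executor JVM's pid; CoarseGrainedExecutorBackend is the
# executor's main class when running under YARN.
jps -m | grep CoarseGrainedExecutorBackend

# Show per-thread CPU usage for that process, to see which threads
# account for the >100% load.
top -H -p <pid>

# Sample GC activity every second: per-generation utilization
# plus young/full GC counts and times.
jstat -gcutil <pid> 1000
```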



  • I have tried to limit cores per executor with spark.executor.cores=1.

  • I have also tried spark.cores; neither setting has any effect.

  • Hardware: Intel(R) Xeon(R) CPU E5-2620 v2 @ 2.10GHz, 6 physical cores.

  • Deployment mode is YARN client.
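The first bullet corresponds to a submit command along these lines (a sketch only; the class name and jar are placeholders, not the actual application):

```shell
# Run in YARN client mode with one core per executor; com.example.Main
# and app.jar are hypothetical names standing in for the real app.
spark-submit \
  --master yarn-client \
  --conf spark.executor.cores=1 \
  --class com.example.Main \
  app.jar
```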


A similar Ubuntu 14.04 setup with 4 physical cores (Intel i5) shows no such issue: each executor uses one core.


Any idea how to fix this?



