Spark on YARN

--------------

How to execute Spark programs

Interactive mode -- spark-shell / pyspark / notebook

-- good for ad-hoc analysis and exploration
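As a sketch, the interactive shells can be pointed at a YARN cluster like this (assumes a configured Hadoop/Spark client; the resource values are illustrative, not required):

```shell
# Scala shell on YARN
spark-shell --master yarn

# Python shell on YARN, with example executor sizing
pyspark --master yarn --num-executors 2 --executor-memory 2g
```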


Submitting a job

----------------

spark-submit

-- for production-ready code
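A minimal sketch of a spark-submit invocation on YARN (my_job.py and the resource numbers are placeholders, not values from these notes):

```shell
# submit a PySpark application to YARN in cluster mode;
# the driver then runs inside the cluster, not on the client machine
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 4g \
  my_job.py
```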


How Spark executes our programs on a cluster:

---------------------------


Spark follows a master-slave architecture.

-- each application has a single driver (the master process)

-- and multiple executors (JVM processes, each with its own CPU cores and memory)


Driver

--------------

-- analyzes the work, divides it into tasks, distributes the tasks to executors, and schedules and monitors them


Executors

----------

-- execute the assigned tasks locally within their own JVM


Each application gets its own distinct driver and its own set of executors.
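The driver/executor split above can be sketched with plain Python (an analogy only, not Spark itself -- all names here are illustrative): one "driver" function divides the data, hands tasks to worker processes playing the role of executors, and collects the results.

```python
# Analogy sketch: a master "driver" divides work into partitions and
# distributes tasks to worker processes, the way a Spark driver
# schedules tasks onto executors. Plain Python, no Spark required.
from concurrent.futures import ProcessPoolExecutor

def task(partition):
    # each "executor" processes its partition locally in its own process
    return sum(partition)

def driver(data, num_executors=2):
    # driver: analyze the work, divide it into partitions,
    # distribute the tasks, and collect the results
    size = len(data) // num_executors or 1
    partitions = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=num_executors) as pool:
        return sum(pool.map(task, partitions))

if __name__ == "__main__":
    print(driver(list(range(10))))  # 45
```

The point of the sketch: the driver never computes a partition itself; it only plans, distributes, and aggregates, which mirrors the role split described above.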

