scala - Not able to run Spark job on YARN cluster
I have a simple Hadoop cluster on top of which Spark runs (i.e. Spark uses YARN as its cluster manager).
I am using Hadoop 2.7, Scala 2.12.1, Spark 2.1.0 and JDK 8.
Now, when I submit a job, it fails with the message below:
17/04/06 23:57:55 INFO yarn.Client: Application report for application_1491534363989_0004 (state: ACCEPTED)
17/04/06 23:57:56 INFO yarn.Client: Application report for application_1491534363989_0004 (state: FAILED)
17/04/06 23:57:56 INFO yarn.Client:
     client token: N/A
     diagnostics: Application application_1491534363989_0004 failed 2 times due to AM Container for appattempt_1491534363989_0004_000002 exited with exitCode: 15
For more detailed output, check application tracking page: http://rm100.hadoop.cluster:8088/cluster/app/application_1491534363989_0004 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1491534363989_0004_02_000001
Exit code: 15
Are there any known issues with JDK 8?
Update
When I run the same program with JDK 7, it works fine. So the question is: do Spark, Scala and Hadoop have issues with JDK 8?
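One way to narrow this down is to check which JVM the YARN containers actually run, since the JDK on the machine you submit from and the JDK on the NodeManager hosts can differ. Below is a minimal diagnostic sketch (the object name and partition count are illustrative choices, not from the original job):

import org.apache.spark.sql.SparkSession

// Prints the JVM version seen by the driver and by the executors,
// to confirm which JDK the YARN containers are actually using.
object JvmVersionCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("JvmVersionCheck").getOrCreate()
    val sc = spark.sparkContext

    // Driver-side JVM version.
    println(s"Driver JVM: ${System.getProperty("java.version")}")

    // Executor-side JVM versions: run a few tasks and collect the
    // distinct java.version values reported from the executor JVMs.
    val executorJvms = sc.parallelize(1 to 4, 4)
      .map(_ => System.getProperty("java.version"))
      .distinct()
      .collect()

    println(s"Executor JVM(s): ${executorJvms.mkString(", ")}")
    spark.stop()
  }
}

If the executors report a different java.version than the driver, the JAVA_HOME configured on the NodeManager hosts is the first thing to check.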
I have been running Spark on a YARN cluster with Java 8 and it runs smoothly. As far as I know, the newer versions of Spark and Scala require Java 8 or above. Here are a few things you need to consider (examples follow the list):
- Check the JAVA_HOME path in hadoop-env.sh (see the example after this list).
- When you start the YARN cluster, make sure all the required daemons are running on each node, using jps.
- Check the logs under the Hadoop log directory.
- Go to http://rm100.hadoop.cluster:8088/cluster/app/application_1491534363989_0004 for more details.
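For the first point, hadoop-env.sh on every node should point JAVA_HOME at the JDK 8 installation. The path below is an assumption for illustration; it depends on where the JDK is installed on your machines:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

For the second point, jps on a worker node should list at least a DataNode and a NodeManager, and the master should show the NameNode and ResourceManager. For the logs, you can also pull the per-container output with the YARN CLI (assuming log aggregation is enabled):

yarn logs -applicationId application_1491534363989_0004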