scala - Not able to run Spark job on YARN cluster


I have a simple Hadoop cluster on top of which Spark runs (i.e., Spark uses YARN as its cluster manager).

I am using Hadoop 2.7, Scala 2.11.1, Spark 2.1.0, and JDK 8.
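
For context, the job is submitted in yarn cluster mode along the following lines (a minimal sketch; the main class and jar path are placeholders, not my actual application):

  # Placeholder class name and jar path; the real application differs.
  spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --class com.example.MyApp \
    /path/to/my-app.jar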

Now, when I submit a job, it fails with the message below:

17/04/06 23:57:55 INFO yarn.Client: Application report for application_1491534363989_0004 (state: ACCEPTED)
17/04/06 23:57:56 INFO yarn.Client: Application report for application_1491534363989_0004 (state: FAILED)
17/04/06 23:57:56 INFO yarn.Client:
     client token: N/A
     diagnostics: Application application_1491534363989_0004 failed 2 times due to AM Container for appattempt_1491534363989_0004_000002 exited with exitCode: 15
For more detailed output, check application tracking page: http://rm100.hadoop.cluster:8088/cluster/app/application_1491534363989_0004 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1491534363989_0004_02_000001
Exit code: 15

Are there any known issues with JDK 8?

Update

When I run the same program using JDK 7, it works fine. So the question is: do Spark, Scala, and Hadoop have any known issues with JDK 8?

I have been using Spark on a YARN cluster with Java 8, and it runs smoothly. As far as I know, newer versions of Spark and Scala require Java 8 or above. Here are a few things you need to check:

  1. Check the JAVA_HOME path in hadoop-env.sh (see the sketch after this list).
  2. When you start the YARN cluster, make sure all the required daemons are running on each node, using jps.
  3. Check the Hadoop log files for errors.
  4. Go to http://rm100.hadoop.cluster:8088/cluster/app/application_1491534363989_0004 for more details.
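
A minimal sketch of the first two checks (the JDK path below is an assumption based on a typical Linux install, not a known value from your cluster; adjust it to wherever JDK 8 lives on your nodes):

  # In $HADOOP_HOME/etc/hadoop/hadoop-env.sh, point JAVA_HOME at JDK 8.
  # Example path only; verify it on your own machines.
  export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

  # On each node, list the running JVM daemons:
  jps
  # Expect the daemons matching the node's role, e.g. ResourceManager and
  # NameNode on the master, NodeManager and DataNode on the workers.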
