scala - Not able to start spark-shell as it yields an error over Hadoop cluster setup, however, works fine without Hadoop cluster -


When I remove the Hadoop cluster configuration folder, spark-shell works fine. However, with the Hadoop cluster configuration folder in place, spark-shell yields the error "Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'" in Spark. I did not configure Hive anywhere. Note that even after shutting down the Hadoop and Spark clusters, spark-shell yields the following error:

[screenshot: error during execution of ./bin/spark-shell]

[screenshot: second part of the same error]

Run: mkdir /user/$(whoami)/spark-warehouse

Then run: spark-shell --conf spark.sql.warehouse.dir=file:///user/$(whoami)/spark-warehouse
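The same warehouse setting can also be applied when building the SparkSession in your own code instead of passing it on the spark-shell command line. Below is a minimal sketch, assuming Spark 2.x, local mode, and an example path (/tmp/spark-warehouse is illustrative, not from the original post):

import org.apache.spark.sql.SparkSession

object WarehouseDirExample {
  def main(args: Array[String]): Unit = {
    // Point Spark SQL at a writable local warehouse directory instead of the
    // default location, which is what the HiveSessionState error complains
    // about when the default path is not writable or not reachable.
    val spark = SparkSession.builder()
      .appName("warehouse-dir-example")
      .master("local[*]")
      .config("spark.sql.warehouse.dir", "file:///tmp/spark-warehouse")
      .getOrCreate()

    spark.range(5).show()  // quick sanity check that the session starts cleanly
    spark.stop()
  }
}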

