scala - Not able to start spark-shell as it yields an error over Hadoop cluster setup, however, works fine without Hadoop cluster -
When the Hadoop cluster configuration folder is removed, spark-shell works fine. However, with the Hadoop cluster configuration folder in place, spark-shell fails with the error "Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'", even though Hive is not configured anywhere. Note that the error persists even after shutting down both the Hadoop and Spark clusters. The following workaround fixes it:
Run: mkdir -p /user/$(whoami)/spark-warehouse
Then run: spark-shell --conf spark.sql.warehouse.dir=file:///user/$(whoami)/spark-warehouse
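For reference, the same spark.sql.warehouse.dir property can be set when building a SparkSession in a standalone Scala application instead of passing --conf to spark-shell. A minimal sketch, assuming a local-mode session and using the user.name system property to mirror the $(whoami) path above (app name and master are illustrative):

    import org.apache.spark.sql.SparkSession

    object WarehouseDirExample {
      def main(args: Array[String]): Unit = {
        // Point Spark SQL at a local warehouse directory, mirroring
        // the --conf flag used with spark-shell above.
        val warehouseDir = s"file:///user/${sys.props("user.name")}/spark-warehouse"

        val spark = SparkSession.builder()
          .appName("warehouse-dir-example") // illustrative name
          .master("local[*]")               // assumes local mode
          .config("spark.sql.warehouse.dir", warehouseDir)
          .getOrCreate()

        // Simple sanity check that the session came up without the
        // HiveSessionState instantiation error.
        spark.sql("SHOW DATABASES").show()

        spark.stop()
      }
    }

Setting the property at session construction matters because the warehouse directory is fixed when the SparkSession is first created and cannot be changed afterwards.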