scala - Unable to start spark-shell: it fails over a Hadoop cluster setup, but works fine without the Hadoop cluster -


When I remove the Hadoop cluster configuration folder, spark-shell works fine. However, with the Hadoop cluster configuration folder in place, spark-shell yields various errors, including "Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'", even though I did not configure Hive anywhere. Note that I also tried shutting down the Hadoop cluster; spark-shell still yields the following error:

[Screenshot: error during execution of ./bin/spark-shell]

[Screenshot: second part of the same error]

Answer: create the warehouse directory, then point spark-shell at it.

Run: mkdir -p /user/$(whoami)/spark-warehouse

Then run: spark-shell --conf spark.sql.warehouse.dir=file:///user/$(whoami)/spark-warehouse
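The two steps above can be combined into a short shell sketch. This assumes spark-shell is launched from the Spark installation directory and that the warehouse path (/user/$(whoami)/spark-warehouse) is just an example location you can change:

```shell
# Build the warehouse path from the current user's login name
WAREHOUSE_DIR="/user/$(whoami)/spark-warehouse"

# Create the directory Spark will use as its SQL warehouse
mkdir -p "$WAREHOUSE_DIR"

# Start spark-shell with spark.sql.warehouse.dir pointing at the local
# directory (file:// scheme) so Spark does not try to resolve the path
# against HDFS, which is what triggers the HiveSessionState error when
# the Hadoop configuration is present but the cluster is unreachable
./bin/spark-shell --conf spark.sql.warehouse.dir="file://$WAREHOUSE_DIR"
```

Using the file:// scheme is the key detail: without it, Spark interprets the warehouse path relative to the default filesystem from the Hadoop configuration.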

