scala - Unable to start spark-shell: it yields an error with the Hadoop cluster setup, but works fine without the Hadoop cluster -


When the Hadoop cluster configuration folder is removed, spark-shell works fine. However, with the Hadoop cluster configuration folder in place, spark-shell fails with various errors, such as "Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'", even though Hive is not configured anywhere. Note that even after shutting down the Hadoop and Spark clusters, spark-shell still yields the following error:

Error during execution of ./bin/spark-shell

Second part of the same error

Run: mkdir -p /user/$(whoami)/spark-warehouse

Then run: spark-shell --conf spark.sql.warehouse.dir=file:///user/$(whoami)/spark-warehouse
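The two steps above can be sketched as a single script. This is a minimal illustration, not the exact answer: it uses a directory under $HOME instead of /user/$(whoami) so it runs without elevated permissions, and the spark-shell invocation is left commented out since it requires a local Spark installation.

```shell
#!/bin/sh
# Create a local warehouse directory for Spark SQL.
# The answer uses /user/$(whoami)/spark-warehouse; a path under
# $HOME is used here so the sketch runs without root privileges.
WAREHOUSE_DIR="$HOME/spark-warehouse"
mkdir -p "$WAREHOUSE_DIR"

# Point spark-shell at the directory via spark.sql.warehouse.dir.
# The file:// scheme makes Spark treat it as a local filesystem
# path rather than an HDFS path, so no Hadoop cluster is needed.
# (Commented out: requires spark-shell on PATH.)
# spark-shell --conf "spark.sql.warehouse.dir=file://$WAREHOUSE_DIR"

echo "warehouse dir ready: $WAREHOUSE_DIR"
```

Pointing spark.sql.warehouse.dir at an explicit local path avoids Spark's attempt to create its default warehouse location, which is what triggers the HiveSessionState instantiation error when that location is not writable or resolvable.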

