Defined a Spark permanent UDF which is visible in the metastore but cannot be used in Hive SQL on Spark -
create function hello as 'com.dtstack.main.udf.helloudf' using jar 'hdfs://172.16.1.151:9000/user/spark/sparkudf.jar'
and then used it as
select hello(xcval) from xctable
error: org.apache.spark.sql.AnalysisException: Undefined function: 'hello'. This function is neither a registered temporary function nor a permanent function registered in the database 'default'.; line 1 pos 7
Can anyone help me?
To create a permanent function in Hive, the JAR needs to be placed on the Hive auxiliary JARs path (hive.aux.jars.path).
That path is the default location from which Hive loads UDF JARs; if the JAR file is not available there, Hive will not be able to access it.
Even though the create function statement tells Hive the location of the JAR ("hdfs://172.16.1.151:9000/user/spark/sparkudf.jar"), you still have to deploy it on the auxiliary path to make it available to Spark: once the Hive session closes, Hive keeps the definition of the function but not the JAR on the classpath, and from then on the JAR is picked up from the auxiliary path.
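The fix described above can be sketched as follows. This is a sketch under assumptions: the auxiliary directory /usr/lib/hive/auxlib is a placeholder (use whatever hive.aux.jars.path points to on your cluster), and the class name, JAR path, column xcval, and table xctable are taken from the question.

```sql
-- First, copy the UDF JAR onto the Hive auxiliary JARs path
-- (directory below is an example; check hive.aux.jars.path on your cluster),
-- e.g. from a shell:
--   hdfs dfs -get /user/spark/sparkudf.jar /usr/lib/hive/auxlib/

-- Then drop and recreate the function so the metastore entry is clean:
DROP FUNCTION IF EXISTS hello;
CREATE FUNCTION hello AS 'com.dtstack.main.udf.helloudf'
  USING JAR 'hdfs://172.16.1.151:9000/user/spark/sparkudf.jar';

-- The function should now resolve from Hive SQL on Spark as well:
SELECT hello(xcval) FROM xctable;
```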
For more information about UDF deployment, please have a look at https://www.cloudera.com/documentation/enterprise/5-4-x/topics/cm_mc_hive_udf.html
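As a session-scoped workaround that needs no auxiliary-path deployment, Spark SQL can also load the JAR explicitly and register a temporary function. This sketch assumes the same class and JAR path as in the question; the function exists only for the current session.

```sql
-- Ship the JAR to the driver and executors for this session:
ADD JAR hdfs://172.16.1.151:9000/user/spark/sparkudf.jar;

-- Register the UDF for the current session only:
CREATE TEMPORARY FUNCTION hello AS 'com.dtstack.main.udf.helloudf';

SELECT hello(xcval) FROM xctable;
```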