hadoop - How can I import a column of type SDO_GEOMETRY from Oracle to HDFS with Sqoop?
Issue
I'm using Sqoop to fetch data from Oracle and put it into HDFS. Unlike the other basic datatypes, SDO_GEOMETRY is meant for spatial data.
My Sqoop job fails while fetching this datatype.
I need to import the column SHAPE, which has the Oracle SDO_GEOMETRY datatype, into HDFS.
I have more than 1000 tables that have an SDO_GEOMETRY column, so how can I handle the datatype in general while the Sqoop imports happen?
I have tried --map-column-java and --map-column-hive, but I still get the error.
Error:
ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive does not support the SQL type for column SHAPE
Sqoop command
Below is the Sqoop command I have:
sqoop import \
  --connect 'jdbc:oracle:thin:xxxxx/xxxxx@(description=(address=(protocol=tcp)(host=xxxxxxx)(port=1521))(connect_data=(sid=xxxxx)))' \
  -m 1 \
  --create-hive-table --hive-import \
  --fields-terminated-by '^' \
  --null-string '\\\\n' --null-non-string '\\\\n' \
  --hive-overwrite \
  --hive-table prod.plan1 \
  --target-dir test/plan1 --delete-target-dir \
  --table prod.plan \
  --map-column-hive se_xao_cad_data=binary \
  --map-column-java shape=string \
  --map-column-hive shape=string
The default type mapping that Sqoop provides between relational databases and Hadoop does not work in this case, which is why the Sqoop job fails. You need to override the mapping, because geometry datatypes are not supported by Sqoop.
Use the parameter below in your Sqoop job:
Syntax: --map-column-java col1=JavaDataType,col2=JavaDataType,...
sqoop import ....... ........ --map-column-java columnNameForSdo_Geometry=String
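Since the question mentions more than 1000 tables, it is worth generating these flags rather than typing them by hand. Below is a minimal POSIX-shell sketch; the function name is hypothetical, and the geometry column names per table would have to come from somewhere such as Oracle's ALL_TAB_COLUMNS view (WHERE DATA_TYPE = 'SDO_GEOMETRY'):

```shell
#!/bin/sh
# Hypothetical helper (not part of Sqoop): given the SDO_GEOMETRY column names
# of one table, print the --map-column-java flags to append to its sqoop import.
map_geometry_columns() {
  out=""
  for col in "$@"; do
    if [ -z "$out" ]; then
      out="--map-column-java ${col}=String"
    else
      out="$out --map-column-java ${col}=String"
    fi
  done
  printf '%s\n' "$out"
}

# Example: a table whose geometry columns are SHAPE and BOUNDARY.
map_geometry_columns SHAPE BOUNDARY
# prints: --map-column-java SHAPE=String --map-column-java BOUNDARY=String
```

A driver script could loop over the table list, call the helper with each table's geometry columns, and splice the result into the sqoop import command line.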
As the column name is SHAPE:
--map-column-java SHAPE=String
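Applied to the failing command from the question, a corrected invocation might look like the sketch below (same placeholders as the question; untested assumption, not a verified fix). Two details worth noting: Oracle usually reports column names in upper case, so the mappings use SHAPE rather than shape, and the Java type must be the class name String (capitalized), while the Hive type stays lowercase string:

```shell
# Hedged sketch: map the SDO_GEOMETRY column to a Java String for the import
# and to a Hive string for the table definition. Placeholders (xxxxx) as above.
sqoop import \
  --connect 'jdbc:oracle:thin:xxxxx/xxxxx@(description=(address=(protocol=tcp)(host=xxxxxxx)(port=1521))(connect_data=(sid=xxxxx)))' \
  -m 1 \
  --table prod.plan \
  --map-column-java SHAPE=String \
  --map-column-hive SHAPE=string,SE_XAO_CAD_DATA=binary \
  --create-hive-table --hive-import --hive-overwrite \
  --hive-table prod.plan1 \
  --target-dir test/plan1 --delete-target-dir \
  --fields-terminated-by '^' \
  --null-string '\\\\n' --null-non-string '\\\\n'
```

Note that --map-column-hive takes a single comma-separated list, so the two Hive overrides from the original command are combined into one flag here.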