
Sqoop Import from Oracle Database into HCatalog Table in Parquet Format Fails with 'java.lang.RuntimeException: Should never be used' Error (Doc ID 1968501.1)

Last updated on OCTOBER 11, 2016

Applies to:

Big Data Appliance Integrated Software - Version 3.1.0 and later
Linux x86-64


Symptoms

On Oracle Big Data Appliance (BDA), a Sqoop import from an Oracle database into an HCatalog table in PARQUET format fails with the following error:

sqoop import \
  -Doraoop.chunk.method=ROWID \
  -Doraoop.timestamp.string=false \
  -Dmapred.child.java.opts="-Djava.security.egd=file:///dev/urandom" \
  -Doraoop.import.partitions=SYS_P4341 \
  --direct \
  --connect jdbc:oracle:thin:@//<DBNode>:1521/<SID> \
  --table WAREHOUSE.TRANSACTION_ITEM_FCT \
  --columns prod_id,transaction_fid,item_qty,card_id,date_id,spend_amt,store_id \
  --create-hcatalog-table \
  --hcatalog-database kroger_mc_p \
  --hcatalog-storage-stanza "STORED AS PARQUET" \
  --map-column-hive store_id=bigint,card_id=bigint,transaction_fid=bigint,prod_id=bigint,date_id=timestamp \
  -m 96 \
  --username EXA_KROUSETLP[MARKETPLACE] \
  --password-file /**/oracle_pass \
  --hcatalog-table TRANSACTION_ITEM_FCT_TEST_JOB

The mappers fail with the following error:

15/02/03 09:36:03 INFO mapreduce.Job: Task Id : attempt_1422914679712_0027_m_000051_2, Status : FAILED
Error: java.lang.RuntimeException: Should never be used
  at org.apache.hive.hcatalog.mapreduce.FileOutputFormatContainer.getRecordWriter(...)
  at org.apache.hive.hcatalog.mapreduce.HCatOutputFormat.getRecordWriter(...)
  at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(...)

The same command with --hcatalog-storage-stanza "STORED AS SEQUENCEFILE" works as expected.
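Since the SEQUENCEFILE variant succeeds, one possible workaround (a sketch only, not the documented resolution) is to import into a SEQUENCEFILE-backed HCatalog table and then convert it to Parquet inside Hive with CREATE TABLE ... AS SELECT. The host, SID, the TRANSACTION_ITEM_FCT_SEQ and TRANSACTION_ITEM_FCT_PQ table names, and the availability of the sqoop and hive CLIs on the BDA node are assumptions; the password file path is reproduced from the command above.

```shell
# Sketch of a workaround: import as SEQUENCEFILE (the variant that works),
# then convert to Parquet in Hive. Names below are illustrative placeholders.

# 1. Import from Oracle into a SEQUENCEFILE-backed HCatalog table.
sqoop import \
  -Doraoop.chunk.method=ROWID \
  --direct \
  --connect jdbc:oracle:thin:@//<DBNode>:1521/<SID> \
  --table WAREHOUSE.TRANSACTION_ITEM_FCT \
  --username EXA_KROUSETLP[MARKETPLACE] \
  --password-file /**/oracle_pass \
  --create-hcatalog-table \
  --hcatalog-database kroger_mc_p \
  --hcatalog-table TRANSACTION_ITEM_FCT_SEQ \
  --hcatalog-storage-stanza "STORED AS SEQUENCEFILE" \
  -m 96

# 2. Rewrite the imported data as a Parquet table inside Hive.
hive -e "
  USE kroger_mc_p;
  CREATE TABLE TRANSACTION_ITEM_FCT_PQ STORED AS PARQUET
    AS SELECT * FROM TRANSACTION_ITEM_FCT_SEQ;
"
```

The Hive CTAS step writes Parquet through Hive's own writer rather than through Sqoop's HCatalog output path, which is where the RuntimeException is raised.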

