
Sqoop Import Into Hive Tables As Parquet Fails on an Encryption Zone (Doc ID 2211778.1)

Last updated on APRIL 08, 2020

Applies to:

Big Data Appliance Integrated Software - Version 4.5.0 and later
Linux x86-64


NOTE: In the examples that follow, user details, cluster names, hostnames, directory paths, filenames, etc. represent a fictitious sample (and are used to provide an illustrative example only). Any similarity to actual persons, or entities, living or dead, is purely coincidental and not intended in any manner. 

A Sqoop import using the --as-parquetfile option fails when the target table's data directory is inside an HDFS encryption zone.

INFO mapreduce.Job: Job job_<JOB_ID> failed with state FAILED due to: Job commit failed: Could not move contents of hdfs://<HOSTNAME>/tmp/test/.temp/job_<JOB_ID>/mr/job_<JOB_ID> to hdfs://<HOSTNAME>/data/encryption_zone/test
Caused by: org.apache.hadoop.ipc.RemoteException( /tmp/encryption_zone/.temp/job_1480530522947_0096/mr/job_1480530522947_0096/<ID>.parquet can't be moved into an encryption zone.
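The error occurs because HDFS transparent encryption forbids renaming files across an encryption zone boundary, and Sqoop's Parquet import stages its output in a temporary directory under /tmp, outside the target zone; the rename performed at job commit is therefore rejected. One general workaround (a sketch only, not necessarily the resolution documented in the full article, with hypothetical paths, connection details, and table names) is to import into a staging directory outside the encryption zone and then copy, rather than rename, the files into the zone, since reads and writes across zone boundaries are permitted:

```shell
# Hypothetical example: import Parquet output to a staging directory
# that is NOT inside an encryption zone, so the job-commit rename succeeds.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott --password-file /user/scott/.pwd \
  --table EMP \
  --as-parquetfile \
  --target-dir /staging/emp_parquet

# Copy (not rename) the result into the encryption zone; a copy re-reads
# and re-writes the data, which is allowed across zone boundaries.
hadoop distcp /staging/emp_parquet hdfs://namenode/data/encryption_zone/emp_parquet

# Clean up the unencrypted staging copy once the data is in the zone.
hdfs dfs -rm -r -skipTrash /staging/emp_parquet
```

Note that the staging copy briefly exists unencrypted on HDFS, so this approach is only appropriate where that is acceptable; otherwise the staging directory itself can be placed in a separate encryption zone.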


