Sqoop Hcatalog/Hive Import into Partitioned Table Causes Error When Sentry is Enabled (Doc ID 1968525.1)

Last updated on OCTOBER 11, 2016

Applies to:

Big Data Appliance Integrated Software - Version 3.1.0 and later
Linux x86-64

Symptoms

Using the following Sqoop HCatalog command to import into a partitioned table results in a permissions error when Sentry is enabled:

sqoop import \
  -Dmapred.job.queue.name=MC-ETL-KROGER \
  -Doraoop.chunk.method=ROWID \
  -Doraoop.timestamp.string=false \
  -Djava.security.egd=file:///dev/urandom \
  -Dmapred.child.java.opts=-Djava.security.egd=file:///dev/urandom \
  -Doraoop.import.partitions=SYS_P4341 \
  --direct \
  --connect jdbc:oracle:thin:@//<DBHost>:1521/<ServiceName or SID> \
  --table WAREHOUSE.TRANSACTION_ITEM_FCT \
  --columns prod_id,transaction_fid,item_qty,card_id,date_id,spend_amt,store_id \
  --hcatalog-table TRANSACTION_ITEM_FCT_TESTEY \
  --hive-partition-key dib_time_code \
  --hive-partition-value 20140102 \
  --create-hcatalog-table \
  --hcatalog-database kroger_mc_p \
  --hcatalog-storage-stanza "STORED AS SEQUENCEFILE" \
  --map-column-hive store_id=bigint,card_id=bigint,transaction_fid=bigint,prod_id=bigint \
  --compress \
  -m 48 \
  --username EXA_KROUSETLP[MARKETPLACE] \
  --password-file /user/***/oracle_pass

Error returned when importing into an HCatalog table with partitions:

15/02/03 09:49:49 INFO mapreduce.Job: Job job_1422914679712_0028 failed with state FAILED due to: Job commit failed: org.apache.hive.hcatalog.common.HCatException : 2006 : Error adding partition to metastore. Cause : org.apache.hadoop.security.AccessControlException: Permission denied. user=***** is not the owner of inode=dib_time_code=20140102
  at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkOwner(DefaultAuthorizationProvider.java:169)
  at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:157)
  at org.apache.sentry.hdfs.SentryAuthorizationProvider.checkPermission(SentryAuthorizationProvider.java:174)
  at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
  at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6286)
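The `checkOwner` frame in the stack trace shows that the failure is an HDFS *ownership* check on the partition directory, not an ordinary read/write permission check. One way to confirm which user owns the partition path is shown below; the warehouse location is an assumption based on the default Hive warehouse directory and the database/table names from the Sqoop command, so adjust it to the table's actual LOCATION.

```shell
# Partition directory HCatalog is trying to register; path assumes the
# default Hive warehouse location (/user/hive/warehouse).
PART_DIR="/user/hive/warehouse/kroger_mc_p.db/transaction_item_fct_testey/dib_time_code=20140102"

# -d lists the directory itself rather than its contents; the owner column
# of the output is what the checkOwner() call compares against.
hdfs dfs -ls -d "$PART_DIR"

# Show any extended ACLs on the same directory for comparison.
hdfs dfs -getfacl "$PART_DIR"
```

If the owner shown differs from the user running the Sqoop job, the ownership check in the "add partition" commit step can reject the request even though ACLs grant that user read and write access.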

Trying to import into a Hive table instead fails with:

Failed with exception MetaException(message:java.lang.NullPointerException)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask


All the needed ACLs are set up as described in http://ingest.tips/2014/12/25/sqoop-import-in-a-world-governed-by-sentry-2/ and http://blog.cloudera.com/blog/2015/01/new-in-cdh-5-3-apache-sentry-integration-with-hdfs/ .

Verified that the user executing the Sqoop command can read and write the target directory, but the error still persists.
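The read/write check above can be reproduced from the command line as the same user that runs the Sqoop job. The table directory below is an assumption based on the default warehouse location; replace it with the table's actual LOCATION if it differs.

```shell
# Sanity-check read and write access as the user running the Sqoop job.
# Path assumes the default Hive warehouse location.
TABLE_DIR="/user/hive/warehouse/kroger_mc_p.db/transaction_item_fct_testey"

# List the directory to confirm read access.
hdfs dfs -ls "$TABLE_DIR"

# Create and remove a zero-length marker file to confirm write access.
hdfs dfs -touchz "$TABLE_DIR/_perm_check"
hdfs dfs -rm "$TABLE_DIR/_perm_check"
```

Note that both checks can pass and the job can still fail, because registering a new partition in the metastore triggers an ownership check on the partition directory rather than a plain permission check.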

Cause
