Unable to Set Queue for Sqoop Hcatalog Job Imports on BDA 4.x (Doc ID 2003305.1)

Last updated on APRIL 26, 2015

Applies to:

Big Data Appliance Integrated Software - Version 4.1.0 and later
Linux x86-64

Symptoms

Prior to BDA 4.x, -Dmapred.job.queue.name= could be used in the sqoop command to specify the pool (queue) in which the job runs. In BDA 4.x, however, the -Dmapred.job.queue.name value is ignored and the jobs run in the default pool.

Exporting -Dmapred.job.queue.name= in the HADOOP_OPTS environment variable has no effect.
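
For reference, the environment-variable approach that was attempted looks roughly like the following. This is a minimal sketch only; MY_QUEUE is a placeholder, not a value from this case:

export HADOOP_OPTS="-Dmapred.job.queue.name=MY_QUEUE"   # MY_QUEUE is a placeholder pool/queue name
sqoop import ...                                        # on BDA 4.x the import still runs in the default pool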

For example, the following command ignores the -Dmapred.job.queue.name= setting:

sqoop import \
  -Dmapred.job.queue.name=MC-XXX_NAME \
  -Dmapreduce.output.fileoutputformat.compress=true \
  -Dmapreduce.output.fileoutputformat.compress.type=BLOCK \
  -Dmapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec \
  -Doraoop.chunk.method=ROWID \
  -Doraoop.timestamp.string=false \
  -Doraoop.import.partitions=SYS_P4385 \
  --direct \
  --connect jdbc:oracle:thin:@//user.domain.com:1521/username.domain.com \
  --table TABNAME.XXXXXX \
  --map-column-hive TRANSACTION_ITEM_FID=bigint,TRANSACTION_FID=bigint,PROD_ID=bigint,PROD_HIST_ID=bigint,STORE_ID=bigint,STORE_HIST_ID=bigint,CARD_ID=bigint,CARD_HIST_ID=bigint,DATE_ID=bigint,TIME_ID=bigint,CURRENCY_ID=bigint,TERMINAL_ID=bigint,TERMINAL_HIST_ID=bigint,PROMO_CYCLE_ID=bigint,TRANSACTION_DTTM=timestamp,LOAD_ID=bigint,CREATED_DTTM=timestamp,MODIFIED_DTTM=timestamp,BASKET_SAMPLE_SEED_NUM=tinyint,LOAD_FILE_DATE=timestamp \
  --m 48 \
  --create-hcatalog-table \
  --hcatalog-database kroger_mc_p \
  --hcatalog-table transaction_item_fct_mc_raw_xxxxx \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec \
  --compress \
  --delete-target-dir \
  --username USERNAME[NAME] \
  --password-file /user/xxxx/xxxx
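
Note: on the YARN/MapReduce 2 stack shipped with BDA 4.x, mapred.job.queue.name is the deprecated MR1 name of this property; the current name is mapreduce.job.queuename. The sketch below only illustrates passing the newer property name on the same kind of command; it is not confirmed here as the resolution for this issue, and MY_QUEUE is a placeholder:

# Sketch only: same import invoked with the MR2-era property name (not confirmed as the fix for this document)
sqoop import -Dmapreduce.job.queuename=MY_QUEUE ... --create-hcatalog-table --hcatalog-database <db> --hcatalog-table <table>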



Cause
