
ODI 12c Mapping Hive to Oracle Using Spark Technology Fails on "LKM Spark to SQL" Global with Password Error (Doc ID 2373855.1)

Last updated on FEBRUARY 03, 2019

Applies to:

Oracle Data Integrator - Version 12.1.3.0.0 and later
Information in this document applies to any platform.

Symptoms

The following error is observed in Oracle Data Integrator (ODI) 12c when running a Mapping that has an Apache Spark source Datastore and uses "LKM Spark to SQL":

ODIKM-SPARK-SYNC-10000: EKM Command Failed with Exception: java.lang.Exception: Traceback (most recent call last):
  File "/tmp/MyMapping_Physical.py", line 107, in <module>
    props = getSqlJDBCProps(sc)
  File "/tmp/MyMapping_Physical.py", line 73, in getSqlJDBCProps
    raise Exception( "Hadoop Credential provider does not contain entry for alias " + pwdholder + ". Spark will be unable to connect to target data server. Configure Hadoop Credential provider and re-try." )
Exception: Hadoop Credential provider does not contain entry for alias odi.MyDBUserName.MyDataServerName.password. Spark will be unable to connect to target data server. Configure Hadoop Credential provider and re-try.
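
For context, the error above is raised by the PySpark script that the KM generates under /tmp (MyMapping_Physical.py in this example). The fragment below is only an illustrative sketch, not the actual generated code: it assumes the script resolves an alias of the form odi.<UserName>.<DataServerName>.password through the Hadoop Credential Provider API on the SparkContext's Hadoop configuration, and raises the exception shown when no configured provider contains that entry.

# Illustrative sketch only -- not the KM-generated code. The connection property
# names and the returned dictionary layout are assumptions; the alias format and
# the exception text come from the traceback above.
def getSqlJDBCProps(sc):
    pwdholder = "odi.MyDBUserName.MyDataServerName.password"
    conf = sc._jsc.hadoopConfiguration()        # org.apache.hadoop.conf.Configuration
    pwd_chars = conf.getPassword(pwdholder)     # None when no credential provider holds the alias
    if pwd_chars is None:
        raise Exception("Hadoop Credential provider does not contain entry for alias " + pwdholder
                        + ". Spark will be unable to connect to target data server."
                        + " Configure Hadoop Credential provider and re-try.")
    return {"user": "MyDBUserName",
            "password": "".join(pwd_chars),     # char[] comes back as an iterable of single characters
            "driver": "oracle.jdbc.OracleDriver"}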

Changes

 

Cause



In this Document
Symptoms
Changes
Cause
Solution
 Solution 1. Update the Hadoop Credentials.
 Solution 2. Set the property hadoop.security.credential.provider.path to source or target on the Data Server in the ODI Topology.
References
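
The detailed steps for both solutions are in the Solution section of the article. As a rough illustration of what the two solution titles point at: Solution 1 creates the missing alias in a Hadoop credential store (for example with the hadoop credential create command), and Solution 2 makes that store visible to the generated Spark job by setting hadoop.security.credential.provider.path for the relevant Data Server in the ODI Topology. The PySpark sketch below (the jceks path and alias are placeholders) simply checks that the alias resolves once a provider is configured:

# Verification sketch with placeholder values -- run in a PySpark session on the
# cluster that executes the Mapping.
from pyspark import SparkContext

sc = SparkContext.getOrCreate()
conf = sc._jsc.hadoopConfiguration()

# Hypothetical provider location; in practice this is supplied by the cluster
# configuration or by the Data Server properties in the ODI Topology.
conf.set("hadoop.security.credential.provider.path",
         "jceks://hdfs/user/odi/odi_credentials.jceks")

alias = "odi.MyDBUserName.MyDataServerName.password"
if conf.getPassword(alias) is not None:
    print("Alias resolves - the Mapping should be able to read the target password.")
else:
    print("Alias still missing - (re)create it in the credential store and check the provider path.")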

