
Target Component Error In Published Pipeline (Doc ID 2902329.1)

Last updated on DECEMBER 15, 2023

Applies to:

Oracle Stream Analytics - Version 19.1.0.0.6 and later
Information in this document applies to any platform.

Symptoms

The customer has created a simple pipeline in Oracle GoldenGate Stream Analytics on OCI.

This pipeline reads from a Kafka stream and writes all messages to a database table on ADW (deployed on the same OCI).


The stream comes from a Kafka topic written by Oracle GoldenGate, so there are standard header fields at the top of each message.
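For context, the "standard fields" are the operation metadata that the GoldenGate Kafka handler prepends to each change record. A minimal, hedged sketch in Python, assuming the handler's default JSON layout (the table, timestamps, and row values below are invented for illustration; actual payloads depend on the handler configuration):

```python
import json

# Hypothetical change record as emitted by the Oracle GoldenGate Kafka
# handler in JSON format. Field names such as "table", "op_type", "op_ts",
# "pos" and "after" follow the handler's default layout; the concrete
# values here are made up for this example.
sample_message = """
{
  "table": "HR.EMPLOYEES",
  "op_type": "I",
  "op_ts": "2023-12-15 10:15:00.000000",
  "current_ts": "2023-12-15T10:15:01.000000",
  "pos": "00000000000000001234",
  "after": {
    "EMPLOYEE_ID": "100",
    "FIRST_NAME": "Steven",
    "LAST_NAME": "King"
  }
}
"""

record = json.loads(sample_message)

# The standard fields precede the row data itself ("after" image).
header = {k: record[k] for k in ("table", "op_type", "op_ts", "current_ts", "pos")}
row = record.get("after", {})

print(header["op_type"])   # operation type, e.g. "I" for insert
print(row["LAST_NAME"])
```

In the pipeline's stream shape, these header fields appear alongside the row columns, which is why the customer sees them "at the top of the message".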


In the stream, all fields are mapped as Text, and in the ADW table all columns have the VARCHAR2 datatype.

When publishing the pipeline, they observed that the behavior of data loading into the target database changed depending on the values of certain application parameters.


In particular, when they used the default parameters:
- EXECUTOR COUNT = 1
- CORES PER EXECUTOR = 2
Data is loaded correctly, but the pipeline is killed later (after several hours, about 10, probably due to the workload).
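Published Stream Analytics pipelines run as Spark applications, so EXECUTOR COUNT and CORES PER EXECUTOR correspond to Spark's executor settings. A rough illustration of the equivalent spark-submit configuration for the defaults above (the class and jar names are placeholders, not taken from this article):

```shell
# Hypothetical spark-submit flags matching the default publish parameters
# (EXECUTOR COUNT = 1, CORES PER EXECUTOR = 2). Class and jar are placeholders.
spark-submit \
  --class com.example.PipelineMain \
  --conf spark.executor.instances=1 \
  --conf spark.executor.cores=2 \
  pipeline-app.jar
```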

To solve this problem, they changed the application parameters to increase the resources:
- EXECUTOR COUNT = 2
- CORES PER EXECUTOR = 4
In this case the pipeline is not killed, but the data loaded is not correct: the values do not match the messages in the Kafka input stream.
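One way to make the mismatch described above concrete is to compare key fields of the Kafka input messages against the rows that actually landed in the target table. A minimal, self-contained sketch (the sample data and key column are hypothetical; in practice the inputs would come from the Kafka topic and a query against the ADW table):

```python
def diff_by_key(source_rows, target_rows, key):
    """Return (missing, mismatched): keys absent from the target,
    and keys whose target row differs from the source row."""
    target_by_key = {r[key]: r for r in target_rows}
    missing, mismatched = [], []
    for row in source_rows:
        k = row[key]
        if k not in target_by_key:
            missing.append(k)
        elif target_by_key[k] != row:
            mismatched.append(k)
    return missing, mismatched

# Hypothetical sample data standing in for Kafka messages and ADW rows.
kafka_rows = [{"ID": "1", "VAL": "a"}, {"ID": "2", "VAL": "b"}]
adw_rows   = [{"ID": "1", "VAL": "a"}, {"ID": "2", "VAL": "x"}]

missing, mismatched = diff_by_key(kafka_rows, adw_rows, "ID")
print(missing)     # []
print(mismatched)  # ["2"]
```

A check like this run against both parameter configurations would show whether the corruption correlates with the executor settings.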


Cause




