Why does OGG Big Data Kafka fail with "Error reading field 'topic_metadata': Error reading array of size xxx, only xx bytes available"?
(Doc ID 2196668.1)
Last updated on NOVEMBER 07, 2019
Applies to:
Oracle GoldenGate Application Adapters - Version 12.1.2.1.6 and later
Information in this document applies to any platform.
Goal
I am getting the error below while sending messages to Kafka; I can see the topic getting created.
DEBUG 2016-10-24 13:01:04,540 [main] DEBUG (KafkaProperties.java:171) - Creating a new producer record instance for schema of the class: oracle.goldengate.handler.kafka.DefaultProducerRecord
ERROR 2016-10-24 13:01:04,691 [kafka-producer-network-thread | producer-1] ERROR (Sender.java:136) - Uncaught error in kafka producer I/O thread:
org.apache.kafka.common.protocol.types.SchemaException: Error reading field 'topic_metadata': Error reading array of size 618343, only 37 bytes available
at org.apache.kafka.common.protocol.types.Schema.read(Schema.java:73)
at org.apache.kafka.clients.NetworkClient.parseResponse(NetworkClient.java:380)
at org.apache.kafka.clients.NetworkClient.handleCompletedReceives(NetworkClient.java:449)
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:269)
---------
What is causing this, and how can it be fixed?
Solution
To view full details, sign in with your My Oracle Support account.
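While the article's full solution requires a My Oracle Support sign-in, this particular SchemaException is commonly caused by a version mismatch between the Kafka client libraries on the handler's classpath and the Kafka broker: a newer client misparses the metadata response of an older broker, producing the "Error reading field 'topic_metadata'" failure on the producer's network thread. A hedged sketch of the relevant handler properties follows; the paths, topic name, and Kafka release shown are illustrative assumptions, not values taken from this article.

```properties
# Sketch of an OGG Big Data Kafka handler properties file (kafka.props).
# All paths and names below are examples, not from this document.
gg.handlerlist=kafkahandler
gg.handler.kafkahandler.type=kafka
gg.handler.kafkahandler.KafkaProducerConfigFile=custom_kafka_producer.properties
gg.handler.kafkahandler.TopicName=oggtopic

# Key point: the kafka-clients jar on gg.classpath should come from the
# SAME Kafka release as the broker. Pointing a 0.9.x client at a 0.8.x
# broker (or vice versa) can raise the 'topic_metadata' SchemaException.
gg.classpath=dirprm/:/opt/kafka_2.11-0.9.0.1/libs/*
```

If the error appears after a broker upgrade or downgrade, comparing the version embedded in the kafka-clients jar name on the classpath against the broker's installed release is a quick first check.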