Last updated on MAY 14, 2014
Applies to: Big Data Appliance Integrated Software - Version 2.2.1 and later
Information in this document applies to any platform.
An Avro file transferred to HDFS using the Flume Avro event source and HDFS sink with the default avro_event serializer does not contain the same data as the original Avro file.
The original file written directly into HDFS can be inspected as an Avro file via Hue's file manager. However, the file produced by the Flume transfer cannot be inspected as an Avro file in Hue's file manager, so it appears corrupted. The file transferred to HDFS through the Flume client is likewise not recognized as an Avro file.
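This behavior is expected with the default avro_event serializer: it wraps each Flume event in Flume's own FlumeEvent schema rather than preserving the original Avro container's schema and records. One way to transfer Avro files intact is to use a source deserializer that understands Avro containers together with the Avro event serializer on the HDFS sink, which rebuilds a proper Avro container from the schema carried in the event headers. A minimal sketch follows; the agent, source, channel, and sink names and the paths are illustrative placeholders, not values from this note:

```properties
# Illustrative names/paths -- adjust for your environment.
agent1.sources  = src1
agent1.channels = ch1
agent1.sinks    = sink1

# Spooling directory source: reads existing Avro container files and emits
# one Flume event per Avro record, carrying the schema in the event headers.
agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /var/flume/avro-in
agent1.sources.src1.deserializer = avro
agent1.sources.src1.deserializer.schemaType = HASH
agent1.sources.src1.channels = ch1

agent1.channels.ch1.type = memory

# HDFS sink: write the records back out as a valid Avro container file,
# using the schema from the event headers instead of the default
# avro_event serializer (which wraps events in the FlumeEvent schema).
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = /flume/avro-out
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.serializer = org.apache.flume.sink.hdfs.AvroEventSerializer$Builder
agent1.sinks.sink1.channel = ch1
```

With this pairing, a file landed in HDFS should again be readable as an Avro file (for example, in Hue's file manager), since the original schema and record encoding are preserved end to end.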