python - kafka-connect: mapper_parsing_exception with elasticsearch connector


I am using the Elasticsearch Kafka connector with an Avro schema and pushing data to a topic. I am trying to push the data to Elasticsearch in batches, so I have a list of dictionaries containing key/value pairs as defined in the schema. I am also using confluent-kafka-python as the Python client for Kafka Connect.

The Avro schema looks like:

    {
      "name": "dictarray",
      "type": "array",
      "items": {
        "type": "record",
        "name": "log_fields",
        "fields": [
          {"name": "accept_lang", "type": "string"},
          ....
        ]
      }
    }
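Before producing, it can help to confirm each dict in the batch actually matches the declared record fields, since a malformed record fails Avro serialization. A stdlib-only sketch; the `check_record` helper is illustrative (not part of confluent-kafka-python), and only `accept_lang` is taken from the schema above, the remaining fields being elided:

```python
# Validate each batch record against the declared record fields before
# handing the list to the Avro producer. Illustrative helper only.
REQUIRED_FIELDS = {"accept_lang"}  # extend with the remaining schema fields


def check_record(record: dict) -> None:
    """Raise if a record is not a dict or is missing declared fields."""
    if not isinstance(record, dict):
        raise TypeError(f"expected dict, got {type(record).__name__}")
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record missing fields: {sorted(missing)}")


batch = [{"accept_lang": "en-US"}, {"accept_lang": "de-DE"}]
for rec in batch:
    check_record(rec)  # every record conforms to the schema subset
```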

I am using connect-avro-standalone with the Schema Registry and publishing data with the schema.registry.url as mentioned in the confluent-kafka-python user guide, but I get the following exception:

[{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"not_x_content_exception","reason":"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"}}] 
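For context, a `not_x_content_exception` means Elasticsearch received bytes it could not recognize as JSON (or another XContent format), i.e. raw serialized bytes reached the index instead of deserialized documents. A common cause is a converter mismatch in the Connect worker: the worker must deserialize Avro via the Schema Registry before the sink writes to Elasticsearch. A hedged worker-properties sketch (hostnames and ports are placeholders for your environment):

```properties
# Connect worker converters for Avro + Schema Registry (illustrative values)
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
```

Note also that the schema above makes each Kafka message a top-level array; since the Elasticsearch sink indexes each Kafka record as one document, a record/struct value (with the array nested inside a field) may be closer to what the sink expects.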

Any help would be appreciated!

