python - kafka-connect: mapper_parsing_exception with Elasticsearch connector


I am using the Elasticsearch Kafka connector with an Avro schema and pushing data to a topic. I am trying to push data to Elasticsearch in batches, so I have a list of dictionaries containing key: value pairs as defined in the schema. I am also using the confluent-kafka-python client to talk to Kafka Connect.

The Avro schema looks like:

{
  "name": "dictarray",
  "type": "array",
  "items": {
    "type": "record",
    "name": "log_fields",
    "fields": [
      {"name": "accept_lang", "type": "string"},
      ....
      ....
      ....
    ]
  }
}
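For illustration, a minimal sketch of what a batch matching this schema looks like in Python (only accept_lang is known from the schema above; the elided fields are omitted, and the sample values are hypothetical):

```python
import json

# The array-of-records schema from the question; only "accept_lang" is
# known from the original post, the remaining fields are elided.
SCHEMA = json.loads("""
{
  "name": "dictarray",
  "type": "array",
  "items": {
    "type": "record",
    "name": "log_fields",
    "fields": [
      {"name": "accept_lang", "type": "string"}
    ]
  }
}
""")

# A batch is a list of dicts, one dict per record in the array.
batch = [
    {"accept_lang": "en-US"},
    {"accept_lang": "de-DE"},
]

# Sanity-check that every dict only uses the declared record fields.
field_names = {f["name"] for f in SCHEMA["items"]["fields"]}
for record in batch:
    assert set(record) <= field_names
```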

I am using connect-avro-standalone with the Schema Registry, and publishing data with schema.registry.url set as described in the confluent-kafka-python user guide, but I get the following exception:

[{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"not_x_content_exception","reason":"compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"}}] 
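A not_x_content_exception generally means Elasticsearch received a body it could not recognize as JSON. One thing worth checking (an observation about the setup above, not a confirmed fix): an Elasticsearch document must be a JSON object at the top level, while the schema here declares a top-level array. A sketch of the two shapes, and of sending one record per message instead of one array-valued message:

```python
import json

# Elasticsearch expects each indexed document body to be a JSON object.
# A top-level array (as an "array" Avro schema would produce) is valid
# JSON but is not a valid single document.
doc_object = json.dumps({"accept_lang": "en-US"})   # object: usable as a document
doc_array = json.dumps([{"accept_lang": "en-US"}])  # array: not a document

# A common workaround (an assumption, not from the original post): emit one
# record per Kafka message so every message maps to one JSON-object document.
batch = [{"accept_lang": "en-US"}, {"accept_lang": "de-DE"}]
messages = [json.dumps(record) for record in batch]
```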

Any help here would be appreciated!

