json_parser

The json_parser processor attempts to parse each record's raw buffer data as JSON. It uses the _dead_letter_action setting to handle parsing errors: none (ignore the bad record), log, throw, or the name of an API configured on the job, which can send bad docs to a kafka topic. See dead letter queue and kafka_dead_letter for dead letter queue details.
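The per-record behavior can be sketched as follows. This is a simplified model for illustration only, not the processor's actual source; parseRecord is a hypothetical helper:

```javascript
// Hypothetical helper sketching how a record's raw buffer is parsed and
// how each _dead_letter_action value is handled on failure.
function parseRecord(buf, deadLetterAction = 'log') {
    try {
        return JSON.parse(buf.toString('utf8'));
    } catch (err) {
        if (deadLetterAction === 'throw') throw err;
        if (deadLetterAction === 'log') console.error('json_parser error:', err.message);
        // 'none' silently drops the record; an API name would instead forward
        // the bad doc to the configured dead letter queue (e.g. a kafka topic).
        return null;
    }
}

const ok = parseRecord(Buffer.from(JSON.stringify({ id: 1 })));   // { id: 1 }
const bad = parseRecord(Buffer.from('not json'), 'none');         // null
```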

Usage

JSON Parse raw records

Example of a job using the json_parser processor

{
    "name": "testing",
    "workers": 1,
    "slicers": 1,
    "lifecycle": "once",
    "assets": [
        "standard"
    ],
    "operations": [
        {
            "_op": "test-reader"
        },
        {
            "_op": "json_parser",
            "source": "name",
            "destination": "name_again"
        }
    ]
}

Example of the data and the expected results

const data = [
    DataEntity.make({}, { _key: '1' }),
    DataEntity.make({}, { _key: '2' }),
    DataEntity.make({}, { _key: '3' }),
];

data[0].setRawData(Buffer.from(JSON.stringify({ id: 1 }), 'utf-8'));
data[1].setRawData(Buffer.from(JSON.stringify({ id: 2 }), 'utf-8'));
data[2].setRawData(Buffer.from(JSON.stringify({ id: 3 }), 'utf-8'));

const results = await processor.run(data);

// results:
[
    DataEntity.make({ id: 1 }),
    DataEntity.make({ id: 2 }),
    DataEntity.make({ id: 3 }),
]

Parameters

| Configuration | Description | Type | Notes |
| ------------- | ----------- | ---- | ----- |
| _op | Name of the operation; it must reflect the exact name of the file | String | required |
| _dead_letter_action | Action to take if a doc cannot be transformed to JSON; accepts none, throw, log, or an API name | String | required, defaults to 'log' |
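When _dead_letter_action is set to an API name, that API must be defined in the job's apis array. The fragment below is a hedged sketch: the field names (_name, connection, topic) assume the kafka_dead_letter API from the Teraslice kafka assets, so check that API's own documentation for the exact schema:

```json
{
    "apis": [
        {
            "_name": "kafka_dead_letter",
            "connection": "default",
            "topic": "bad_docs"
        }
    ],
    "operations": [
        { "_op": "test-reader" },
        {
            "_op": "json_parser",
            "_dead_letter_action": "kafka_dead_letter"
        }
    ]
}
```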