I have a JSON file as input to my Elasticsearch 7.10.1 cluster. The format of the JSON is something like:
{
  "data" : "eyJtZXRyaWNfc3RyZWFtX25hbWUiOiJtGltZW5zaW9ucy...
}
The data value in the JSON is the base64 encoding of another JSON document. How can I set things up in Elasticsearch so that the base64 value is decoded and each field inside the decoded JSON gets indexed?
Ingest pipeline to the rescue!! You can create an ingest pipeline that decodes the base64-encoded field, parses the resulting JSON, and adds all of its fields to the document. The examples below use a field named b64; in your case the field is called data, so adjust the field name accordingly. It basically goes like this:
PUT _ingest/pipeline/b64-decode
{
  "processors": [
    {
      "script": {
        "source": "ctx.decoded = ctx.b64.decodeBase64();"
      }
    },
    {
      "json": {
        "field": "decoded",
        "add_to_root": true
      }
    },
    {
      "remove": {
        "field": "decoded"
      }
    }
  ]
}
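Before wiring the pipeline into your indexing flow, you can sanity-check it with the simulate API (the sample document below is just an illustration):

POST _ingest/pipeline/b64-decode/_simulate
{
  "docs": [
    {
      "_source": {
        "b64": "eyJmaWVsZCI6ICJoZWxsbyB3b3JsZCJ9"
      }
    }
  ]
}

The response shows each document's _source as it would look after the pipeline runs, so you can confirm the decoded fields are added as expected.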
Then you can refer to that ingest pipeline when indexing new documents, as shown below:
PUT index/_doc/1?pipeline=b64-decode
{
  "b64": "eyJmaWVsZCI6ICJoZWxsbyB3b3JsZCJ9"
}
The b64 field contains the following base64-encoded JSON:
{ "field" : "hello world" }
Finally, the document that will be indexed will look like this:
{
  "b64" : "eyJmaWVsZCI6ICJoZWxsbyB3b3JsZCJ9",
  "field" : "hello world"
}
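If every document sent to this index needs decoding, you can also register the pipeline as the index default via the standard index.default_pipeline setting, so you don't have to pass ?pipeline= on each request (index here stands for your actual index name):

PUT index/_settings
{
  "index.default_pipeline": "b64-decode"
}

With that in place, a plain PUT index/_doc/1 will run through b64-decode automatically.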