 

JQ: Create CSV out of aggregated json

I'm trying to parse an aggregated Elasticsearch search result with jq to create a CSV, but I can't get the result I need; I hope someone can help. I have the following JSON:

[
  {
    "key_as_string": "2017-09-01T00:00:00.000+02:00",
    "key": 1506808800000,
    "doc_count": 5628,
    "agg1": {
      "doc_count_error_upper_bound": 5,
      "sum_other_doc_count": 1193,
      "buckets": [
        {
          "key": "value3",
          "doc_count": 3469,
          "agg2": {
            "doc_count_error_upper_bound": 1,
            "sum_other_doc_count": 3459,
            "buckets": [
              {
                "key": "10367.xxx",
                "doc_count": 1
              },
              {
                "key": "10997.xxx",
                "doc_count": 1
              },
              {
                "key": "12055.xxx",
                "doc_count": 1
              },
              {
                "key": "12157.xxx",
                "doc_count": 1
              },
              {
                "key": "12435.xxx",
                "doc_count": 1
              },
              {
                "key": "12volt.xxx",
                "doc_count": 1
              },
              {
                "key": "13158.xxx",
                "doc_count": 1
              },
              {
                "key": "13507.xxx",
                "doc_count": 1
              },
              {
                "key": "13597.xxx",
                "doc_count": 1
              },
              {
                "key": "137.xxx",
                "doc_count": 1
              }
            ]
          }
        },
        {
          "key": "value2",
          "doc_count": 608,
          "agg2": {
            "doc_count_error_upper_bound": 0,
            "sum_other_doc_count": 577,
            "buckets": [
              {
                "key": "saasf.xxx",
                "doc_count": 7
              },
              {
                "key": "asfasf.xxx",
                "doc_count": 5
              },
              {
                "key": "sasfsd.xxx",
                "doc_count": 3
              },
              {
                "key": "werwer.xxx",
                "doc_count": 3
              },
              {
                "key": "werwre.xxx",
                "doc_count": 3
              },
              {
                "key": "a-werwr.xxx",
                "doc_count": 2
              },
              {
                "key": "aef.xxx",
                "doc_count": 2
              },
              {
                "key": "sadhdhh.xxx",
                "doc_count": 2
              },
              {
                "key": "dhsdfsdg.xxx",
                "doc_count": 2
              },
              {
                "key": "ertetrt.xxx",
                "doc_count": 2
              }
            ]
          }
        },
        {
          "key": "value1",
          "doc_count": 358,
          "agg2": {
            "doc_count_error_upper_bound": 0,
            "sum_other_doc_count": 336,
            "buckets": [
              {
                "key": "fhshfg.xxx",
                "doc_count": 3
              },
              {
                "key": "sgh.xxx",
                "doc_count": 3
              },
              {
                "key": "12.xxx",
                "doc_count": 2
              },
              {
                "key": "sbgs.xxx",
                "doc_count": 2
              },
              {
                "key": "dp-eca.xxx",
                "doc_count": 2
              },
              {
                "key": "ztuhfb.xxx",
                "doc_count": 2
              },
              {
                "key": "javascript.xxx",
                "doc_count": 2
              },
              {
                "key": "koi-fdhfh.xxx",
                "doc_count": 2
              },
              {
                "key": "sdfh.xxx",
                "doc_count": 2
              },
              {
                "key": "etz5.xxx",
                "doc_count": 2
              }
            ]
          }
        }
      ]
    }
  }
]

This is just a small snippet; in reality I have these results for every single day (the timestamp is in `key_as_string`). I need a CSV that gives me the following result:

2017-09-01T00:00:00.000+02:00,value3,10367.xxx,1
2017-09-01T00:00:00.000+02:00,value3,10997.xxx,1
...
2017-09-01T00:00:00.000+02:00,value2,saasf.xxx,7
2017-09-01T00:00:00.000+02:00,value2,asfasf.xxx,5
...
2017-09-01T00:00:00.000+02:00,value1,fhshfg.xxx,3
2017-09-01T00:00:00.000+02:00,value1,sgh.xxx,3
..
asked Nov 29 '25 by Thomas S.
1 Answer

jq solution:

jq -r '.[] | .key_as_string as $ks | .agg1.buckets[] | .key as $key 
           | .agg2.buckets[] | [$ks,$key,.key,.doc_count] | @csv' jsonfile
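Note that `@csv` always quotes string fields. If you want the exact unquoted output shown in the question, `join(",")` can be swapped in instead. This is a sketch that assumes none of your keys contain commas, since `join` performs no CSV escaping:

```shell
jq -r '.[]
       | .key_as_string as $ks
       | .agg1.buckets[]
       | .key as $key
       | .agg2.buckets[]
       | [$ks, $key, .key, (.doc_count | tostring)]
       | join(",")' jsonfile
```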

The output (for your current input):

"2017-09-01T00:00:00.000+02:00","value3","10367.xxx",1
"2017-09-01T00:00:00.000+02:00","value3","10997.xxx",1
"2017-09-01T00:00:00.000+02:00","value3","12055.xxx",1
"2017-09-01T00:00:00.000+02:00","value3","12157.xxx",1
"2017-09-01T00:00:00.000+02:00","value3","12435.xxx",1
"2017-09-01T00:00:00.000+02:00","value3","12volt.xxx",1
"2017-09-01T00:00:00.000+02:00","value3","13158.xxx",1
"2017-09-01T00:00:00.000+02:00","value3","13507.xxx",1
"2017-09-01T00:00:00.000+02:00","value3","13597.xxx",1
"2017-09-01T00:00:00.000+02:00","value3","137.xxx",1
"2017-09-01T00:00:00.000+02:00","value2","saasf.xxx",7
"2017-09-01T00:00:00.000+02:00","value2","asfasf.xxx",5
"2017-09-01T00:00:00.000+02:00","value2","sasfsd.xxx",3
"2017-09-01T00:00:00.000+02:00","value2","werwer.xxx",3
"2017-09-01T00:00:00.000+02:00","value2","werwre.xxx",3
"2017-09-01T00:00:00.000+02:00","value2","a-werwr.xxx",2
"2017-09-01T00:00:00.000+02:00","value2","aef.xxx",2
"2017-09-01T00:00:00.000+02:00","value2","sadhdhh.xxx",2
"2017-09-01T00:00:00.000+02:00","value2","dhsdfsdg.xxx",2
"2017-09-01T00:00:00.000+02:00","value2","ertetrt.xxx",2
"2017-09-01T00:00:00.000+02:00","value1","fhshfg.xxx",3
"2017-09-01T00:00:00.000+02:00","value1","sgh.xxx",3
"2017-09-01T00:00:00.000+02:00","value1","12.xxx",2
"2017-09-01T00:00:00.000+02:00","value1","sbgs.xxx",2
"2017-09-01T00:00:00.000+02:00","value1","dp-eca.xxx",2
"2017-09-01T00:00:00.000+02:00","value1","ztuhfb.xxx",2
"2017-09-01T00:00:00.000+02:00","value1","javascript.xxx",2
"2017-09-01T00:00:00.000+02:00","value1","koi-fdhfh.xxx",2
"2017-09-01T00:00:00.000+02:00","value1","sdfh.xxx",2
"2017-09-01T00:00:00.000+02:00","value1","etz5.xxx",2
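If jq is not available, the same flattening (timestamp → `agg1` bucket → `agg2` bucket) is easy to do in a short script. Here is a minimal Python sketch; the inline `sample` data is a trimmed stand-in for the real file:

```python
import csv
import io
import json

def flatten(results):
    """Walk timestamp -> agg1 bucket -> agg2 bucket and collect CSV rows."""
    rows = []
    for day in results:
        ts = day["key_as_string"]
        for outer in day["agg1"]["buckets"]:
            for inner in outer["agg2"]["buckets"]:
                rows.append([ts, outer["key"], inner["key"], inner["doc_count"]])
    return rows

# Trimmed stand-in for the real aggregation result; in practice you would
# load the whole file, e.g. results = json.load(open("jsonfile")).
sample = json.loads("""
[{"key_as_string": "2017-09-01T00:00:00.000+02:00",
  "agg1": {"buckets": [
      {"key": "value3",
       "agg2": {"buckets": [{"key": "10367.xxx", "doc_count": 1}]}}]}}]
""")

buf = io.StringIO()
csv.writer(buf).writerows(flatten(sample))
print(buf.getvalue())
```

Unlike jq's `@csv`, Python's `csv.writer` only quotes fields when needed, so the output matches the unquoted format from the question.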
answered Dec 02 '25 by RomanPerekhrest