Hi,
I need to convert a large JSON payload (>10k rows) into an Excel file with one row per record and two columns.
The JSON looks something like this:
{
"data": [
{
"field1": "Value1", "field2": "Value2", "field3": "Value3",
"field4": "Value4", "field5": "Value5", "field6": "Value6"
},
{
"field1": "Value1", "field2": "Value2", "field3": "Value3",
"field4": "Value4", "field5": "Value5", "field6": "Value6"
},
.....
]
}
The Excel output should look something like this:
Row1 ColA=value1,value2,value3 ColB=value4,value5,value6
Row2 ColA=value1,value2,value3 ColB=value4,value5,value6
....
RowN ColA=value1,value2,value3 ColB=value4,value5,value6
Currently, I am doing this:
- convert the JSON into a custom object
- loop through the custom object array, extract fields 1-6, and append each record to a list variable
- for each ROW in the list
  - for each COL in ROW
    - write the cell to Excel
  - end
- end
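The transformation step above can be sketched as follows (a minimal sketch: the field names and the two-column split are taken from the example JSON; the sample payload is hypothetical). Building all rows in memory first, then writing them in bulk (e.g. one `worksheet.append(row)` per row with openpyxl, or `pandas.DataFrame(rows).to_excel(...)`) is usually much faster than writing individual cells, since the per-cell write is typically what dominates the runtime:

```python
import json

# Hypothetical sample payload in the shape described in the question.
payload = """
{
  "data": [
    {"field1": "Value1", "field2": "Value2", "field3": "Value3",
     "field4": "Value4", "field5": "Value5", "field6": "Value6"}
  ]
}
"""

def json_to_rows(text):
    """Turn each record into one row of two comma-joined cells:
    ColA = field1..field3, ColB = field4..field6."""
    records = json.loads(text)["data"]
    rows = []
    for rec in records:
        col_a = ",".join(rec[f"field{i}"] for i in range(1, 4))
        col_b = ",".join(rec[f"field{i}"] for i in range(4, 7))
        rows.append([col_a, col_b])
    return rows

rows = json_to_rows(payload)
# rows == [["Value1,Value2,Value3", "Value4,Value5,Value6"]]
```

The list of rows can then be handed to a bulk writer in one pass instead of a nested per-cell loop.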
This works for small JSON payloads, but for a large payload the cell-by-cell writes take far too long to complete.
Is there a simpler (and faster) way?