To convert JSON data to a CSV file in Go, you need to define a struct that matches the JSON objects, decode the JSON file into a slice of these structs, and finally write each element of that slice as a row of the CSV file. The two main packages needed are encoding/json, which decodes the JSON data with json.Decoder, and encoding/csv, which writes the output CSV data with csv.Writer.
package main

import (
	"encoding/csv"
	"encoding/json"
	"fmt"
	"log"
	"os"
)

// 1. Create a new struct for storing read JSON objects
type FruitAndVegetableRank struct {
	Vegetable string `json:"vegetable"`
	Fruit     string `json:"fruit"`
	Rank      int64  `json:"rank"`
}

func convertJSONToCSV(source, destination string) error {
	// 2. Read the JSON file into the struct array
	sourceFile, err := os.Open(source)
	if err != nil {
		return err
	}
	// remember to close the file at the end of the function
	defer sourceFile.Close()

	var ranking []FruitAndVegetableRank
	if err := json.NewDecoder(sourceFile).Decode(&ranking); err != nil {
		return err
	}

	// 3. Create a new file to store CSV data
	outputFile, err := os.Create(destination)
	if err != nil {
		return err
	}
	defer outputFile.Close()

	// 4. Write the header of the CSV file and the successive rows
	// by iterating through the JSON struct array
	writer := csv.NewWriter(outputFile)
	defer writer.Flush()

	header := []string{"vegetable", "fruit", "rank"}
	if err := writer.Write(header); err != nil {
		return err
	}

	for _, r := range ranking {
		var csvRow []string
		csvRow = append(csvRow, r.Vegetable, r.Fruit, fmt.Sprint(r.Rank))
		if err := writer.Write(csvRow); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	if err := convertJSONToCSV("data.json", "data.csv"); err != nil {
		log.Fatal(err)
	}
}
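To make the input and output concrete, here is a hypothetical data.json (the fruit and vegetable values are made up purely for illustration) together with the data.csv that the program above would produce from it:

[
  { "vegetable": "carrot", "fruit": "apple", "rank": 1 },
  { "vegetable": "potato", "fruit": "banana", "rank": 2 },
  { "vegetable": "broccoli", "fruit": "cherry", "rank": 3 }
]

vegetable,fruit,rank
carrot,apple,1
potato,banana,2
broccoli,cherry,3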
Create a new struct for storing read JSON objects #
// 1. Create a new struct for storing read JSON objects
type FruitAndVegetableRank struct {
	Vegetable string `json:"vegetable"`
	Fruit     string `json:"fruit"`
	Rank      int64  `json:"rank"`
}
The first step of JSON to CSV conversion is to load the JSON data into a Go struct. So we define a type whose fields match the data in the file and annotate them with JSON struct field tags, so that the decoder knows how to map the JSON keys to the struct fields.
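To see the tags in action, here is a minimal, self-contained sketch (the struct is repeated only so the snippet compiles on its own, and the inline JSON literal is made-up sample data) that decodes a single object and prints the resulting struct:

package main

import (
	"encoding/json"
	"fmt"
)

// Same struct as in the article; the tags map the lowercase JSON keys
// to the exported Go fields.
type FruitAndVegetableRank struct {
	Vegetable string `json:"vegetable"`
	Fruit     string `json:"fruit"`
	Rank      int64  `json:"rank"`
}

func main() {
	data := []byte(`{"vegetable": "carrot", "fruit": "apple", "rank": 1}`)

	var r FruitAndVegetableRank
	if err := json.Unmarshal(data, &r); err != nil {
		fmt.Println("decode error:", err)
		return
	}
	fmt.Printf("%+v\n", r) // prints: {Vegetable:carrot Fruit:apple Rank:1}
}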
Read the JSON file into the struct array #

// 2. Read the JSON file into the struct array
sourceFile, err := os.Open(source)
if err != nil {
	return err
}
// remember to close the file at the end of the function
defer sourceFile.Close()

var ranking []FruitAndVegetableRank
if err := json.NewDecoder(sourceFile).Decode(&ranking); err != nil {
	return err
}
We can now start processing our JSON file. We open it (remember to close the file to release resources back to the system, for example with the defer keyword) and then create a new json.Decoder with this file as an argument. Since json.NewDecoder(r io.Reader) accepts an io.Reader, we do not need to read the content of the file beforehand; if we used the json.Unmarshal() function instead, that would be necessary. With the Decode() method, we read the JSON file and convert it into a slice of FruitAndVegetableRank objects.
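For comparison, a json.Unmarshal() based variant would have to load the whole file into memory first. Here is a sketch of that alternative (readRanking is a hypothetical helper name; it relies on the same os and encoding/json imports as the full program):

// readRanking is an alternative to the decoder-based code above:
// it reads the entire JSON file into memory and then unmarshals it.
func readRanking(source string) ([]FruitAndVegetableRank, error) {
	// os.ReadFile loads the whole file into a byte slice up front.
	sourceBytes, err := os.ReadFile(source)
	if err != nil {
		return nil, err
	}

	var ranking []FruitAndVegetableRank
	if err := json.Unmarshal(sourceBytes, &ranking); err != nil {
		return nil, err
	}
	return ranking, nil
}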
Create a new file to store CSV data #

// 3. Create a new file to store CSV data
outputFile, err := os.Create(destination)
if err != nil {
	return err
}
defer outputFile.Close()
The CSV data will be saved to a file, so in this step, we create a new destination file in a pretty standard way.
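One thing worth knowing is that os.Create truncates the destination file if it already exists. If we wanted the conversion to fail instead of silently overwriting an existing data.csv, we could open the file with os.OpenFile and the O_EXCL flag; a sketch of that variant:

// Fails with an "already exists" error instead of truncating an existing file.
outputFile, err := os.OpenFile(destination, os.O_WRONLY|os.O_CREATE|os.O_EXCL, 0o644)
if err != nil {
	return err
}
defer outputFile.Close()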
Write the header of the CSV file and the successive rows by iterating through the JSON struct array #
// 4. Write the header of the CSV file and the successive rows
// by iterating through the JSON struct array
writer := csv.NewWriter(outputFile)
defer writer.Flush()

header := []string{"vegetable", "fruit", "rank"}
if err := writer.Write(header); err != nil {
	return err
}

for _, r := range ranking {
	var csvRow []string
	csvRow = append(csvRow, r.Vegetable, r.Fruit, fmt.Sprint(r.Rank))
	if err := writer.Write(csvRow); err != nil {
		return err
	}
}
As the last step, we create a new csv.Writer that writes the data in CSV format to the output file. Remember to call writer.Flush to ensure that all the buffered content is written out before the function finishes. The writing process consists of iterating through the slice of FruitAndVegetableRank objects and building a CSV row for each of them. This row is then saved with the writer.Write() method. In the example, we also wrote the header row as the first line of the file.
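Note that writer.Flush() itself does not return an error; csv.Writer reports any failure from a previous Write or Flush through its Error() method. If we wanted convertJSONToCSV to surface such a failure to the caller rather than rely on the deferred flush, its ending could be adjusted along these lines (a sketch, not the version used above):

	// Flush explicitly instead of deferring, so a buffered-write failure
	// can be returned to the caller.
	writer.Flush()
	if err := writer.Error(); err != nil {
		return err
	}
	return nil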