
I'd like to read a csv file from disk into a []map[string]string, where the slice index is the line number and the map key is the corresponding header from line 1 of the csv file.

I could not find anything in the standard library to accomplish this.

    No. CSV data is structured as a slice (rows) of slices (columns), and that's how encoding/csv treats it. You can see the entire functionality in the documentation. If your data has an initial row of headers (or column of headers), you can transform the data as you see fit in your code. Commented Jul 18, 2019 at 20:25

2 Answers


Based on the reply above, it sounds like there is nothing in the standard library, such as ioutil, to read a csv file into a map.

The following function, given a path to a csv file, converts it into a slice of map[string]string.

Update: based on a comment, I decided to provide my CSVFileToMap() func along with a MapToCSVFile() func that writes the map back to a csv file.

    package main

    import (
        "bytes"
        "encoding/csv"
        "fmt"
        "io/ioutil"
        "os"
        "path/filepath"
        "sort"
        "strings"
    )

    // CSVFileToMap  reads csv file into slice of map
    // slice is the line number
    // map[string]string where key is column name
    func CSVFileToMap(filePath string) (returnMap []map[string]string, err error) {

        // read csv file
        csvfile, err := os.Open(filePath)
        if err != nil {
            return nil, fmt.Errorf("opening csv file: %w", err)
        }

        defer csvfile.Close()

        reader := csv.NewReader(csvfile)

        rawCSVdata, err := reader.ReadAll()
        if err != nil {
            return nil, fmt.Errorf("reading csv file: %w", err)
        }

        header := []string{} // holds first row (header)
        for lineNum, record := range rawCSVdata {

            // for first row, build the header slice
            if lineNum == 0 {
                for i := 0; i < len(record); i++ {
                    header = append(header, strings.TrimSpace(record[i]))
                }
            } else {
                // for each cell, map[string]string k=header v=value
                line := map[string]string{}
                for i := 0; i < len(record); i++ {
                    line[header[i]] = record[i]
                }
                returnMap = append(returnMap, line)
            }
        }

        return
    }



    // MapToCSVFile  writes slice of map into csv file
    // filterFields filters to only the fields in the slice, and maintains order when writing to file
    func MapToCSVFile(inputSliceMap []map[string]string, filePath string, filterFields []string) (err error) {

        var headers []string  // slice of each header field
        var line []string     // slice of each line field
        var csvLine string    // string of line converted to csv
        var CSVContent string // final output of csv containing header and lines

        // iter over slice to get all possible keys (csv header) in the maps
        // using empty Map[string]struct{} to get UNIQUE Keys; no value needed
        var headerMap = make(map[string]struct{})
        for _, record := range inputSliceMap {
            for k := range record {
                headerMap[k] = struct{}{}
            }
        }

        // convert unique headersMap to slice
        for headerValue := range headerMap {
            headers = append(headers, headerValue)
        }

        // filter to filteredFields and maintain order
        var filteredHeaders []string
        if len(filterFields) > 0 {
            for _, filterField := range filterFields {
                for _, headerValue := range headers {
                    if filterField == headerValue {
                        filteredHeaders = append(filteredHeaders, headerValue)
                    }
                }
            }
        } else {
            filteredHeaders = append(filteredHeaders, headers...)
            sort.Strings(filteredHeaders) // alpha sort headers
        }

        // write headers as the first line
        csvLine, err = WriteAsCSV(filteredHeaders)
        if err != nil {
            return err
        }
        CSVContent += csvLine + "\n"

        // iter over inputSliceMap to get values for each map
        // maintain order provided in header slice
        // write to csv
        for _, record := range inputSliceMap {
            line = []string{}

            // lines
            for _, h := range filteredHeaders {
                line = append(line, record[h])
            }
            csvLine, err = WriteAsCSV(line)
            if err != nil {
                return err
            }
            CSVContent += csvLine + "\n"
        }

        // make the dir in case it's not there
        err = os.MkdirAll(filepath.Dir(filePath), os.ModePerm)
        if err != nil {
            return err
        }

        // write out the csv contents to file
        err = ioutil.WriteFile(filePath, []byte(CSVContent), os.FileMode(0644))
        if err != nil {
            return err
        }

        return
    }

    func WriteAsCSV(vals []string) (string, error) {
        b := &bytes.Buffer{}
        w := csv.NewWriter(b)
        err := w.Write(vals)
        if err != nil {
            return "", err
        }
        w.Flush()
        return strings.TrimSuffix(b.String(), "\n"), nil
    }

Finally, here is a test case to show its usage:

    func TestMapToCSVFile(t *testing.T) {
        // note: this test requires that the file ExistingCSVFile exists on disk
        // with a few rows of csv data
        SomeKey := "some_column"
        ValueForKey := "some_value"
        OutputCSVFile := `.\someFile.csv`
        ExistingCSVFile := `.\someExistingFile.csv`

        // read csv file
        InputCSVSliceMap, err := CSVFileToMap(ExistingCSVFile)
        if err != nil {
            t.Fatalf("CSVFileToMap() failed %v", err)
        }

        // add a new column "some_column" with a value of "some_value" to the third row
        InputCSVSliceMap[2][SomeKey] = ValueForKey

        err = MapToCSVFile(InputCSVSliceMap, OutputCSVFile, nil)
        if err != nil {
            t.Fatalf("MapToCSVFile() failed writing outputReport %v", err)
        }

        // VALIDATION: check that Key field is present in MapToCSVFile output file
        // read Output csv file
        OutputCSVSliceMap, err := CSVFileToMap(OutputCSVFile)
        if err != nil {
            t.Fatalf("CSVFileToMap() failed reading output file %v", err)
        }

        // check that the added key has a value for Key
        if OutputCSVSliceMap[2][SomeKey] != ValueForKey {
            t.Fatalf("MapToCSVFile() expected row to contain key %q with value: %v", SomeKey, ValueForKey)
        }
    }

3 Comments

I just wanted to comment and say I have no idea why anyone downvoted you so much. I was looking for something to do exactly this. I've found the Go community can be pretty hostile and I'm not really sure why
@DBA108642 I updated the code with my latest and added the corresponding func to write back to file. hope it helps
Sadly the Go community does tend to become very hostile if you ever dare to suggest that something could be improved in the language. I really like the language itself and what it tries to achieve but the community really lets it down sometimes!

It is disappointing that the Go csv package does not contain anything as useful as the csv.DictReader we have in Python (https://docs.python.org/3/library/csv.html#csv.DictReader).

However, I have found the csvutil package (https://github.com/jszwec/csvutil) very useful. It doesn't seem to support unmarshalling records into a map[string]string, but you can unmarshal directly into a struct type, which IMO is more useful.

