
Is bash capable of extracting rows and columns from CSV files? I'm hoping I don't have to resort to Python...

My 5-column csv file looks like:

Rank,Name,School,Major,Year
1,John,Harvard,Computer Science,3
2,Bill,Yale,Political Science,4
3,Mark,Stanford,Biology,1
4,Jane,Princeton,Electrical Engineering,3
5,Alex,MIT,Management Economics,2

I only want to extract the 3rd, 4th, and 5th column contents, ignoring the first row, so output looks like:

Harvard,Computer Science,3
Yale,Political Science,4
Stanford,Biology,1
Princeton,Electrical Engineering,3
MIT,Management Economics,2

So far I can only get awk to print out either every row or every column of my CSV file, but not specific rows and columns like in this case! Can bash do this?

  • It's odd that you're struggling to get awk to do this, since printing fields (columns) and records (rows) is the most basic thing awk is designed to do. Makes me think there must be more to this than you've described so far... Commented Jan 24, 2013 at 6:28

11 Answers

awk -F, 'NR > 1 { print $3 "," $4 "," $5 }' file.csv

NR is the current record (line) number, while $3, $4, and $5 are the fields produced by splitting each line on the separator given to -F.


1 Comment

You can set OFS=',' so that you don't have to concatenate commas in the print.
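For example, a sketch of that OFS variant (the sample rows are copied from the question):

```shell
# Sample data from the question
cat > file.csv <<'EOF'
Rank,Name,School,Major,Year
1,John,Harvard,Computer Science,3
2,Bill,Yale,Political Science,4
EOF
# With OFS set, the commas between print's arguments become output separators
awk -F, -v OFS=',' 'NR > 1 { print $3, $4, $5 }' file.csv
# prints:
# Harvard,Computer Science,3
# Yale,Political Science,4
```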

Try this:

tail -n+2 file.csv | cut --delimiter=, -f3-5

1 Comment

Most simple and elegant solution yet.

Bash solutions:

Using IFS

#!/bin/bash
while IFS=',' read -r rank name school major year; do
    echo -e "Rank\t: $rank\nName\t: $name\nSchool\t: $school\nMajor\t: $major\nYear\t: $year\n"
done < file.csv
IFS=$' \t\n'

Using String Manipulation and Arrays

#!/bin/bash
declare -a arr
while read -r line; do
    arr=(${line//,/ })
    printf "Rank\t: %s\nName\t: %s\nSchool\t: %s\nMajor\t: %s\nYear\t: %s\n" ${arr[@]}
done < file.csv

2 Comments

Fairly unwieldy, but I like the use of arrays which I will probably refer to again at some point. Not to mention it's a bash-only solution.
This fails to handle commas inside quotes. Example CSV line: "some, text",1,2 will be parsed as: some, text, 1, 2 instead of some text, 1, 2
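For completeness, here is a pure-bash sketch that produces exactly the output the question asks for, assuming five comma-separated fields and no quoted commas (the limitation noted in the comment above):

```shell
#!/bin/bash
# Sample data from the question
cat > file.csv <<'EOF'
Rank,Name,School,Major,Year
1,John,Harvard,Computer Science,3
2,Bill,Yale,Political Science,4
EOF
{
    read -r _    # discard the header line
    # Split each remaining line on commas; the first two fields go to `_`
    while IFS=',' read -r _ _ school major year; do
        printf '%s,%s,%s\n' "$school" "$major" "$year"
    done
} < file.csv
# prints:
# Harvard,Computer Science,3
# Yale,Political Science,4
```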

Use cut and tail:

tail -n +2 file.txt | cut -d ',' -f 3-

1 Comment

The OP wanted to skip the first line, that's why we used tail.
sed 1d file.csv | while IFS=, read -r first second rest; do echo "$rest"; done
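The trick here is that read assigns everything after the second comma to the last variable, rest. A quick sketch demonstrating this on one of the question's sample rows:

```shell
# `rest` soaks up everything past the second comma, so the last three
# columns come through intact even though Major contains a space.
printf '%s\n' 'Rank,Name,School,Major,Year' '1,John,Harvard,Computer Science,3' > demo.csv
sed 1d demo.csv | while IFS=, read -r first second rest; do echo "$rest"; done
# prints: Harvard,Computer Science,3
```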



Here you go, a simple AWK program.

#!/usr/bin/awk -f

BEGIN {
    # set field separator to comma to split CSV fields
    FS = ","
}

# NR > 1 skips the first line
NR > 1 {
    # print only the desired fields
    printf("%s,%s,%s\n", $3, $4, $5)
}

1 Comment

If you set OFS=",", you can simply write print $3, $4, $5
perl -F, -lane 'if($.!=1){print join ",",@F[2,3,4];}' your_file




This might work for you (GNU sed):

sed -r '1d;s/([^,]*,){2}//' file
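To see what the regex removes, here is a one-line sketch (using -E, the portable spelling of GNU sed's -r):

```shell
# ([^,]*,){2} matches the first two fields plus their trailing commas;
# substituting with nothing leaves fields 3 onward untouched.
echo '1,John,Harvard,Computer Science,3' | sed -E 's/([^,]*,){2}//'
# prints: Harvard,Computer Science,3
```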



Try this:

awk -F, 'NR > 1 { OFS=",";print $3, $4, $5 }' temp.txt

Or this:

sed -re '1d;s/^[0-9],\w+,//g' temp.txt

2 Comments

Can you provide some explanation of what your fixes are doing?
@JonEgerton: in the awk I added the OFS, and in the sed I made the regex clearer so that new users can see what I am matching. The regexes in the previous answers are short but hard to comprehend for someone new to regex. Mine may not be perfect, but at least it's visible what they are doing, and they work.

I have created a package for this kind of task - gumba. If you feel comfortable with CoffeeScript, you can give it a try:

cat file.csv | tail -n +2 | \
gumba "words(',').take((words)-> words.last(3)).join(',')"


grep '^,' outlook.contacts.csv | sed 's/^,\([^,]*\),[^,]*,\([^,]*\),.*/\1 \2/'

Get all lines that start with a comma, then use sed to pull the first and second name out of the non-blank fields.

Be careful: for some reason, once you paste it, the line changes to the following, so you may be better off typing it carefully by hand.

grep '^,' outlook.contacts.csv | sed 's/^,([^,]),[^,],([^,]),./\1 \2/'
