
I'm using Terminal in OS X 10.8. The challenge is to read a text file (where each line is a URL), get the HTTP headers for each URL, and save the results to another text file.

Tried this:

for line in cat ~/Desktop/a.txt; do curl -I line > ~/Desktop/b.txt;  done

plus multiple loop examples like

(while read l; do echo $l; done) < ~/Desktop/a.txt 

or

cat ~/Desktop/a.txt | while read CMD; do
echo $CMD
done

It seems that I cannot create a simple loop. Please advise. Best,

1 Answer


You can try something like this:

for i in $(cat test); do curl -I "$i" >> test2; done

This reads everything in the file test and appends the curl output to the file test2.
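
If any line in the input file carries a trailing carriage return (for example, the file was saved with Windows line endings) or is blank, curl can fail with exactly the "nodename nor servname provided, or not known" error mentioned in the comments. A while-read loop that strips the CR and skips empty lines is a bit more defensive than $(cat ...); a minimal sketch, assuming the same ~/Desktop/a.txt and ~/Desktop/b.txt paths from the question:

# Read one URL per line; skip blank lines and strip a trailing carriage return
while IFS= read -r url; do
    url=${url%$'\r'}            # drop a Windows CR, if present
    [ -z "$url" ] && continue   # skip empty lines
    curl -I "$url" >> ~/Desktop/b.txt
done < ~/Desktop/a.txt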


2 Comments

Thank you for the fast reply. Unfortunately, it does not work :(
Sorry, I just found out that the input file was empty (probably because of my recent experiments). The error now is: nodename nor servname provided, or not known
