Download file using s3fs

I have multiple URLs like 'https://static.nseindia.com/s3fs-public/2022-09/ind_prs01092022.pdf' and I want to loop through an array of these and download them to a local folder. I saw that I may need to use s3fs, but I am unsure what the bucket name should be.

  • Do you already know the URLs? If so, you can just open a requests.Session and get the files one by one in a loop, or in parallel using aiohttp. Commented Sep 13, 2022 at 9:13
  • @StSav012 I tried that. It just times out. Commented Sep 14, 2022 at 6:37

1 Answer


It appears the web server doesn't respond unless a User-Agent header is present in the request. This behavior is fairly common.

import requests

with requests.Session() as s:
    # The server only responds when a User-Agent header is sent
    r = s.get(
        'https://static.nseindia.com/s3fs-public/2022-09/ind_prs01092022.pdf',
        headers={'User-Agent': 'Python'},  # or any non-empty string
    )
    r.raise_for_status()
    with open('ind_prs01092022.pdf', 'wb') as f:
        f.write(r.content)
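
To cover the looping part of the original question, here is a minimal sketch, assuming a hypothetical list of URLs and a hypothetical local folder name ('downloads'); adjust both to your own data:

import os
import requests

urls = [
    'https://static.nseindia.com/s3fs-public/2022-09/ind_prs01092022.pdf',
    # ... the rest of your URLs
]
out_dir = 'downloads'  # hypothetical local folder
os.makedirs(out_dir, exist_ok=True)

with requests.Session() as s:
    s.headers['User-Agent'] = 'Python'  # the same fix, applied to every request
    for url in urls:
        r = s.get(url)
        r.raise_for_status()
        # Name the local copy after the last path segment of the URL
        with open(os.path.join(out_dir, url.rsplit('/', 1)[-1]), 'wb') as f:
            f.write(r.content)

Setting the header once on the session means every request in the loop carries it, so no per-call headers argument is needed.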
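
The parallel approach suggested in the comments should also work once the same header is sent; the time-out reported there is consistent with the missing User-Agent. A minimal aiohttp sketch under that assumption, again with a hypothetical URL list and output folder:

import asyncio
import os

import aiohttp

urls = [
    'https://static.nseindia.com/s3fs-public/2022-09/ind_prs01092022.pdf',
    # ... the rest of your URLs
]
out_dir = 'downloads'  # hypothetical local folder

async def fetch(session, url):
    # Download one file and save it under its URL's last path segment
    async with session.get(url) as resp:
        resp.raise_for_status()
        data = await resp.read()
    with open(os.path.join(out_dir, url.rsplit('/', 1)[-1]), 'wb') as f:
        f.write(data)

async def main():
    os.makedirs(out_dir, exist_ok=True)
    # Headers set on the session are sent with every request
    async with aiohttp.ClientSession(headers={'User-Agent': 'Python'}) as session:
        await asyncio.gather(*(fetch(session, url) for url in urls))

asyncio.run(main())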