
The code below works fine for the first few test URLs in C:\testurl.txt, but then hangs forever while processing the 4th URL. I have no idea why it hangs.

It works fine for up to 3 URLs but gets stuck from the 4th onward.

CLS
$urllist = Get-Content "C:\testurl.txt" # URLs to test, one per line
foreach ($url in $urllist) {
    Write-Host $url

    $req = [System.Net.WebRequest]::Create($url)

    try {
        $res = $req.GetResponse()
    } catch [System.Net.WebException] {
        $res = $_.Exception.Response
    }

    $res.StatusCode
    #Print OK or whatever

    [int]$res.StatusCode
    #Print 200 or whatever
}

It works fine for up to 3 URLs but hangs on the 4th URL without any output or error message. Here is an example of C:\testurl.txt:

http://www.google.com
http://www.google.com     
http://www.google.com
http://www.google.com
http://www.hotmail.com
http://www.gmail.com
http://www.yahoo.com
http://www.msn.com

Please note that each URL is on its own line. You will see that the script stops at the 4th one; you can try this with your own URLs too.

  • That 4th URL being ... what? Commented Aug 6, 2019 at 15:20
  • @Ansgar Wiechers If you add multiple URLs to test in the file c:\testurl.txt, for example: www.google.com www.hotmail.com www.gmail.com www.yahoo.com www.msn.com, then you will see that the script stops at www.yahoo.com (the 4th one). Commented Aug 6, 2019 at 15:23
  • No, it doesn't. It will throw an error for all of them if you specify just the FQDNs without a protocol scheme, though. Commented Aug 6, 2019 at 15:31
  • @M-ACharlotte - PLEASE add the list to your ORIGINAL POST. [frown] Commented Aug 6, 2019 at 15:46
  • @M-ACharlotte - good! now ... put them in code format so that folks can read them ... [grin] Commented Aug 6, 2019 at 15:54

1 Answer


then it hung up forever

No - it's hung until the underlying TCP connections of the previous requests time out.

The .NET runtime internally pools WebRequest connections, so only a finite number of concurrent requests are sent to any one host. As long as unclosed WebResponse objects are still holding those connections, subsequent requests queue up and appear to hang.
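As an illustration (my assumption, not part of the original answer): the cap in question is the per-host connection limit exposed through System.Net.ServicePointManager.DefaultConnectionLimit, which defaults to 2 for client applications. You can inspect or raise it, but that only postpones the hang while responses stay unclosed:

[System.Net.ServicePointManager]::DefaultConnectionLimit        # typically 2 in a console client
[System.Net.ServicePointManager]::DefaultConnectionLimit = 10   # postpones, but does not cure, the hang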

You can avoid this by closing them (as you should):

foreach ($url in $urllist) {
    Write-Host $url

    $req = [System.Net.WebRequest]::Create($url)

    try {
        $res = $req.GetResponse()
    } 
    catch [System.Net.WebException] {
        $res = $_.Exception.Response
    }
    finally {
        if ($res -is [System.Net.WebResponse]) {
            $res.StatusCode
            #Print OK or whatever

            [int]$res.StatusCode
            #Print 200 or whatever

            $res.Dispose()
            # close connection, dispose of response stream
        }
    }
}
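As an aside, here is a sketch of the same loop written with Invoke-WebRequest (assuming Windows PowerShell 5.1 and -UseBasicParsing), which manages the underlying request and response for you; non-success status codes surface as a WebException whose Response still carries the status code:

foreach ($url in $urllist) {
    Write-Host $url
    try {
        $response = Invoke-WebRequest -Uri $url -UseBasicParsing
        [int]$response.StatusCode   # e.g. 200
    }
    catch [System.Net.WebException] {
        # HTTP error responses (404, 500, ...) land here in Windows PowerShell 5.1
        [int]$_.Exception.Response.StatusCode
    }
}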

1 Comment

It's perfect! I missed disposing of the stream, and that's why it was hanging after a few URLs. Thanks for the help!
