
I have a basic batch script that uses [curl](https://curl.haxx.se/). The script reads values from a text file named location_ids.txt (found in the same folder as the script) and checks each location_id against 3 different URLs. It works well, but it is very slow, since batch files aren't threaded and each command blocks until it completes. I am aware this can be done more easily and quickly with a PowerShell script (Windows environment) using Invoke-RestMethod. How can I replicate the script below in PowerShell? I would like the curl calls to run simultaneously.

@echo off
setlocal enabledelayedexpansion
set OUTPUT_FILE=%DATE:/=-%@%TIME::=-%
set OUTPUT_FILE=file_%OUTPUT_FILE: =%.html

for /f %%i in (location_ids.txt) do (
    set LOCATION_ID=%%i
    echo ^<h2 style='color:green;'^>!LOCATION_ID::=!^</h2^> >>%OUTPUT_FILE%
    curl -k -H "application/x-www-form-urlencoded" -X GET -d "id=!LOCATION_ID::=!" http://localhost:5000/location_site1 >>%OUTPUT_FILE% 
    curl -k -H "application/x-www-form-urlencoded" -X GET -d "id=!LOCATION_ID::=!" http://localhost:5000/location_site2 >>%OUTPUT_FILE%
    curl -k -H "application/x-www-form-urlencoded" -X GET -d "id=!LOCATION_ID::=!" http://localhost:5000/location_site3 >>%OUTPUT_FILE%
    echo ^<br^>^<br^> >>%OUTPUT_FILE%
)

EDIT:

My attempt to make multiple simultaneous calls to http://localhost:5000/location_site1 using a script block and Start-Job. The code below does not output any results.

$runRoutePost =
{ param([string]$id, [string]$fileLocation)
    Write-Host "Accessing site for $id";
    $ResponseData = New-Object PSObject;
    $webclient = new-object System.Net.WebClient;
    $apiParams = "id=$_";
    $ResponseData = $webclient.UploadString("http://localhost:5000/location_site1",$apiParams) |Add-Content $fileLocation;
}

Get-Content location_ids.txt | ForEach-Object {
    Start-Job -ScriptBlock $runRoutePost -ArgumentList $_, $LOG_FILE
} 

1 Answer


To convert your example, you really just need to make a request to the url and specify the location id as a query string parameter. The example below uses string interpolation to set the value of the id parameter. The $_ variable is the current item that is being enumerated within the ForEach-Object script block.

$outputFile = "file_$(Get-Date -Format 'MM-dd-yyyy@HH-mm-ss').html" # adjust the format to match your original file name

Get-Content location_ids.txt | ForEach-Object {
    Add-Content $outputFile "<h2 style=`"color:green`">$_</h2>"
    Invoke-RestMethod -Uri "http://localhost:5000/location_site1?id=$_" | Add-Content $outputFile
    Invoke-RestMethod -Uri "http://localhost:5000/location_site2?id=$_" | Add-Content $outputFile
    Invoke-RestMethod -Uri "http://localhost:5000/location_site3?id=$_" | Add-Content $outputFile
    Add-Content $outputFile "<br><br>"
}

For a GET request you do not need to specify the content type or method. However, if you need them for other requests, you can use the -ContentType and/or -Method parameters:

Invoke-RestMethod -Method GET -ContentType application/x-www-form-urlencoded  -Uri "http://localhost:5000/location_site3?id=$_"
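
If you instead need to send the id in a POST body, Invoke-RestMethod can send a form-encoded body directly. A minimal sketch, assuming PowerShell 3.0 or later:

Invoke-RestMethod -Method Post -ContentType "application/x-www-form-urlencoded" -Body "id=$_" -Uri "http://localhost:5000/location_site1" | Add-Content $outputFile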

More documentation can be found by running this:

get-help Invoke-RestMethod -full

Since you are restricted to PowerShell v2, which does not include Invoke-RestMethod, you can use the .NET WebClient type instead.

$web = new-object System.Net.WebClient
Get-Content location_ids.txt | ForEach-Object {
    Add-Content $outputFile "<h2 style=`"color:green`">$_</h2>"
    $web.DownloadString("http://localhost:5000/location_site1?id=$_") | Add-Content $outputFile
    $web.DownloadString("http://localhost:5000/location_site2?id=$_") | Add-Content $outputFile
    $web.DownloadString("http://localhost:5000/location_site3?id=$_") | Add-Content $outputFile
    Add-Content $outputFile "<br><br>"
}
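
WebClient implements IDisposable, so once the loop has finished the client can be cleaned up (optional):

$web.Dispose()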

If you instead want to send a POST request using WebClient, the UploadString method can be used:

$web.UploadString("http://localhost:5000/location_site1","id=$_") | Add-Content $outputFile
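
If you do need to set the Content-Type header, WebClient exposes a Headers collection that can be assigned before the call. A minimal sketch (set it before each request, since WebClient may reset headers between calls):

$web.Headers["Content-Type"] = "application/x-www-form-urlencoded"
$web.UploadString("http://localhost:5000/location_site1","id=$_") | Add-Content $outputFile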

Update in response to edit

To run these jobs in parallel and collect the results, you need to wait for all the jobs to complete using Wait-Job and then extract the results using Receive-Job.

$runRoutePost = { 
    param([string]$id)

    Write-Host "Accessing site for $id"
    $webclient = new-object System.Net.WebClient
    $webclient.UploadString("http://localhost:5000/location_site1","id=$id") 
}
$Jobs = Get-Content location_ids.txt | ForEach-Object {
    Start-Job -ScriptBlock $runRoutePost -ArgumentList $_
}

Wait-Job -Job $Jobs

Receive-Job -Job $Jobs | Add-Content $LOG_FILE
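
Putting it together with your original output format, here is a sketch that queries all three sites for each id inside the job, wraps the responses in the same HTML markers as your batch script, and removes the jobs once their output has been collected (the URLs and $LOG_FILE are taken from your scripts):

$runRoutePost = {
    param([string]$id)

    $webclient = New-Object System.Net.WebClient
    # wrap the responses in the same markers the batch script wrote
    $html = "<h2 style='color:green;'>$id</h2>"
    foreach ($site in "location_site1", "location_site2", "location_site3") {
        $html += "`r`n" + $webclient.UploadString("http://localhost:5000/$site", "id=$id")
    }
    $html + "`r`n<br><br>"
}

$Jobs = Get-Content location_ids.txt | ForEach-Object {
    Start-Job -ScriptBlock $runRoutePost -ArgumentList $_
}

Wait-Job -Job $Jobs | Out-Null
Receive-Job -Job $Jobs | Add-Content $LOG_FILE
Remove-Job -Job $Jobs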

Comments

@Mary - Gotcha, in that case the .NET WebClient type can be used. I'll post an example after lunch.
You can definitely make it a POST request. I just assumed your URL was set up to only accept GET based on your curl commands.
Yeah most definitely. With UploadString you are responsible for making your content match your content-type, so just be sure to separate the values with &.
I simplified your script a bit and fixed an error in your script block, where you were setting the params to id=$_ instead of id=$id.