
I am having an issue passing an array member to another script. I have a VM build script that pulls from a CSV; I end up with a $VM object with .Name, .CPU, .RAM, .IP, etc. I want to pass that VM object to another script (inside the new server) which can then act on it, but have been unable to do so. I have been testing the syntax just to pass a simple array, as shown below, but am still not successful:

CSV:

Name,NumCPU,MemoryGB,IPAddress
JLTest01,2,4,172.24.16.25

Script1:

Function TestMe {
[CmdLetBinding()]
Param (
  [Parameter(Mandatory, Position=1)]
     [array]$arr
)

$arr | Out-GridView
}

TestMe

Calling Script:

$aVMs = Import-Csv -Path "PathToCsv"

foreach($VM in $aVMs) {  
   $command = "<path>\TestMe.ps1 " + "-arr $($VM)"
   Invoke-Expression $command
}

This produces an error, which seems to occur while parsing the array. The error states:

The term 'JLTest01' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. At line:1 char:48 + ... \Desktop\TestMe.ps1 -arr @{Name=JLTest01; NumCPU ...

Just trying to figure out what I am doing wrong exactly, and what I need to do to pass the object to the second script.

1 Answer


Don't use Invoke-Expression (which is rarely the right tool and should generally be avoided for security reasons):

Stringifying the custom objects output by Import-Csv, which is what $($VM) does, does not preserve the original objects; it produces a hashtable-like text representation that isn't suitable for programmatic processing and breaks the syntax of the command line you're passing to Invoke-Expression.
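To illustrate, here's a minimal sketch (using the values from your CSV) of what that stringification looks like:

$VM = [pscustomobject]@{ Name = 'JLTest01'; NumCPU = 2; MemoryGB = 4; IPAddress = '172.24.16.25' }
"$($VM)"   # -> @{Name=JLTest01; NumCPU=2; MemoryGB=4; IPAddress=172.24.16.25}

When that text is embedded in a command string, Invoke-Expression parses @{...} as a hashtable literal whose unquoted values are interpreted as command names, which is exactly why the error complains that the term 'JLTest01' is not recognized.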

Instead, just invoke your script directly:

$aVMs = Import-Csv -Path "PathToCsv"

.\TestMe.ps1 -arr $aVMs

Note that I'm passing $aVMs as a whole to your script, given that your -arr parameter is array-typed.

If you'd rather process the objects one by one, stick with the foreach approach (but then you should declare the type of your $arr parameter as [pscustomobject] rather than [array]):

$aVMs = Import-Csv -Path "PathToCsv"

foreach ($VM in $aVMs) {
  .\TestMe.ps1 -arr $VM
}
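
A sketch of the corresponding per-object version of TestMe.ps1 (keeping your -arr parameter name, though a singular name would arguably read better):

# TestMe.ps1 - per-object variant: receives a single row imported from the CSV.
[CmdLetBinding()]
Param (
  [Parameter(Mandatory, Position=1)]
  [pscustomobject]$arr
)

$arr | Out-GridView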

Another option is to declare $arr as accepting pipeline input, add a process block to your script, and then pipe $aVMs to your script ($aVMs | .\TestMe.ps1).
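A sketch of what that pipeline-enabled variant could look like (again keeping the -arr parameter name):

# TestMe.ps1 - pipeline-enabled variant.
[CmdLetBinding()]
Param (
  [Parameter(Mandatory, ValueFromPipeline)]
  [pscustomobject]$arr
)

process {
  # Runs once for each object received from the pipeline;
  # each object gets its own grid view, mirroring the per-object call above.
  $arr | Out-GridView
}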


Also, don't nest a same-named function inside your .ps1 script and then call it from within the script, especially not without passing the arguments through; scripts can declare parameters directly, just like functions:

[CmdLetBinding()]
Param (
  [Parameter(Mandatory, Position=1)]
  [array]$arr
)

$arr | Out-GridView

4 Comments

I'd dot-source the script as . .\Script1.ps1 and then call the function directly as TestMe -arr $aVMs. Also, remove the trailing TestMe line from the Script1.ps1 script.
Good point, @JosefZ, I hadn't paid attention to the content of the script; the simplest solution is to not nest a function inside the script and make it accept parameters directly - answer updated.
Thanks for the suggestions, I appreciate your help. Making the changes you suggested works and passes the object as expected. My only concern is how it will behave if I pass the whole $aVMs array and there are multiple $VM objects within it, but this is a start. If nothing else I can parse the array inside the called script to ensure I am only targeting the items I need.
@JLogan3o13, you can stick with the foreach approach if you want to, but then you should declare the type of your $arr parameter as [pscustomobject] rather than [array] - please see my update.
