
I have created a PSCustomObject; when the variable is called in the ISE, it displays a table of the relevant data. However, if I try to compare the PSCustomObject with another object, the comparison doesn't behave correctly. I'd like to tell the script that if any of the rows in the existing CSV match the PSCustomObject, it should not export the data to the CSV; in other words, skip duplicate rows in the CSV file. The CSV may or may not have multiple rows.

$fileInfo = @(
    [pscustomobject]@{
        user_id          = $user
        studio           = $studio
        function         = $Task
        end_time_local   = $creationTime
        asin             = $ASIN
        variant          = $variant
        process_class_id = $processClass
    }
)
$currentData = Import-Csv "$scansFolder\$fileName.csv"
if ($fileInfo -ne $currentData) {
    $fileInfo | Export-Csv "$scansFolder\$fileName.csv" -Append -NoTypeInformation -Force
}
5 Comments

  • What is the structure of the imported CSV? If they are the same, then use the Compare-Object cmdlet. Commented Apr 15, 2019 at 13:45
  • Your $fileInfo variable is an array. Do you need to support multiple custom objects to exclude from the CSV? Commented Apr 15, 2019 at 14:10
  • @Lee_Dailey The CSV is the same as the custom object, with headers and the 7 items mentioned in $fileInfo. Commented Apr 15, 2019 at 14:16
  • @mklement0 I think that may be the problem; I tried something like the following with no luck: foreach($line in $currentData){ if($fileInfo -ne $currentData){ $fileInfo | Export-Csv "$scansFolder\$fileName.csv" -Append -NoTypeInformation -Force }} Any ideas on how to do that? Commented Apr 15, 2019 at 14:18
  • @Garrett - you can use the Compare-Object cmdlet to find out if ONE object is in a collection of similar objects. Take a look at the -IncludeEqual parameter. Commented Apr 15, 2019 at 14:25
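
For illustration, a minimal sketch of what that -IncludeEqual suggestion could look like; the choice of key properties (user_id, asin, end_time_local) is an assumption - adjust it to whatever uniquely identifies a row:

# Illustrative only: test whether $fileInfo already exists in the CSV by
# comparing on a subset of properties, then append it only if it does not.
$existing = Import-Csv "$scansFolder\$fileName.csv"

$alreadyPresent = Compare-Object -Property user_id, asin, end_time_local `
                    -ReferenceObject $existing -DifferenceObject $fileInfo `
                    -IncludeEqual -ExcludeDifferent

if (-not $alreadyPresent) {
    $fileInfo | Export-Csv "$scansFolder\$fileName.csv" -Append -NoTypeInformation
}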

1 Answer


[pscustomobject] is a .NET reference type, so comparing two instances[1] with -eq will test for reference equality (identity), i.e. if the two instances are one and the same object[2] - which is obviously not the case in your scenario.
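
To make that concrete, here is a minimal, illustrative snippet (the property values are made up):

# Two custom objects with identical property values are still two distinct
# instances, so -eq / -ne compare object identity, not content.
$a = [pscustomobject]@{ user_id = 'jdoe'; asin = 'B000TEST1' }
$b = [pscustomobject]@{ user_id = 'jdoe'; asin = 'B000TEST1' }

$a -eq $b   # $false - different instances, despite identical property values
$a -eq $a   # $true  - same instance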

Assuming that the properties of your custom objects are instances of value types or strings (which appears to be the case), you can use Compare-Object to compare objects by their property values, with the ability to compare two collections:

$fileInfo = @(
  [pscustomobject]@{
    user_id          = $user
    studio           = $studio
    function         = $Task
    end_time_local   = $creationTime
    asin             = $ASIN
    variant          = $variant
    process_class_id = $processClass
  }
)

# Get the property names.
# This assumes that the CSV data has (at least) the same
# set of properties (columns).
$propNames = $fileInfo[0].psobject.properties.Name

$currentData = Import-Csv "$scansFolder\$fileName.csv"

# Compare the $fileInfo custom object(s) to the custom objects read
# from the CSV file and append only those that are unique to the LHS ('<='),
# i.e., those that aren't already present in the file.
Compare-Object -Property $propNames $fileInfo $currentData |
  Where-Object SideIndicator -eq '<=' | Select-Object -Property $propNames |
    Export-Csv "$scansFolder\$fileName.csv" -Append -NoTypeInformation -Force
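
Note: when -Property is used, Compare-Object's output objects carry only the specified properties plus SideIndicator (there is no InputObject property), which is why the property names are re-selected above before the objects are exported.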

[1] Import-Csv outputs [pscustomobject] instances too.

[2] See the Equality Comparison help topic (written for C#, but applies analogously to PowerShell's -eq operator).


8 Comments

@Garrett: So the property names match. I'm unclear on how you're planning to check for duplicates: your question uses a [pscustomobject] literal to compare against the CSV data, and that's what my answer shows you how to do. Are you saying that $fileInfo comes from a different file? Or do you need to weed out duplicates inside a given file? How do you determine what's new?
@Garrett, I'm still not sure I fully understand, but you can read multiple CSV files into $fileInfo with a single Import-Csv call, using something like Import-Csv -LiteralPath (Get-Item *.csv) (using the string *.csv directly may not work, due to this bug), and compare that to the objects in the reference CSV file read into $currentData via Compare-Object, as shown.
@Garrett, you're probably better off keeping track persistently of which files you've already processed, especially as the number of files grows larger, along with the output CSV file growing. Apart from that, what I recommended in my previous comment should work: read all input CSV files into $fileInfo - $fileInfo = Import-Csv -LiteralPath (Get-Item c:\path\to\inputs\*.csv) - then use the Compare-Object call against $currentData, as shown in the answer (see the sketch after these comments).
Generally, I suggest we bring closure to this question: my answer addresses your question as asked, so I suggest you accept it; if there are follow-up questions, please create a new question (feel free to ping me here once you've done so).
Based on the above suggestions, I'm going to take a new route with this script. I believe mklement0 answered the original question.
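
For reference, a minimal sketch of the multi-file approach suggested above; the input path is a placeholder taken from the comment, and .FullName is added only for robustness when passing the file objects to -LiteralPath:

# Illustrative sketch: read all input CSV files in one call, then append only
# the rows that are not already present in the reference CSV file.
$fileInfo    = Import-Csv -LiteralPath (Get-Item 'C:\path\to\inputs\*.csv').FullName
$currentData = Import-Csv "$scansFolder\$fileName.csv"
$propNames   = $fileInfo[0].psobject.Properties.Name

Compare-Object -Property $propNames $fileInfo $currentData |
  Where-Object SideIndicator -eq '<=' | Select-Object -Property $propNames |
    Export-Csv "$scansFolder\$fileName.csv" -Append -NoTypeInformation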