I have code that loads an XML document, runs $xmlDoc.SelectNodes($XPath), and then, foreach ($node in $nodes), pokes each node's XML into a table as a string.
This code works fine on files of ca. 100 KB with 10 records.
However, I have a file of ca. 100 MB with ca. 50k records, and the code just hangs at $xmlDoc = [xml](gc $xmlpath) while consuming all available system memory. Is there a better way to generate my array $nodes without first parsing the entire XML document into an in-memory DOM?
# Load the XML document
$xmlpath = $filepath
$xmlDoc = [xml](gc $xmlpath)
$nodes = $xmlDoc.SelectNodes('//root') # One element per record in SQL
...
$SqlQuery = @"
INSERT INTO {0} VALUES ({1})
"@
....
foreach ($node in $nodes)
{
    $StringWriter = New-Object System.IO.StringWriter
    $XmlWriter = New-Object System.Xml.XmlTextWriter $StringWriter
    $XmlWriter.Formatting = "None"
    $node.WriteTo($XmlWriter)
    # Flush after writing, so the StringWriter holds the complete element
    $XmlWriter.Flush()
    $StringWriter.Flush()
    # Quote the data content for the INSERT statement
    $Pxml = "'" + $StringWriter.ToString() + "'"
    # Write to database
    $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
    $SqlCmd.CommandText = [string]::Format($SqlQuery, $tableName, $Pxml)
    $SqlCmd.Connection = $SqlConnection
    # INSERT returns no result set, so ExecuteNonQuery is the appropriate call
    [void]$SqlCmd.ExecuteNonQuery()
}
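(As an aside, concatenating the XML into the INSERT statement will break as soon as a record contains an apostrophe; passing the string as a SqlParameter avoids both the quoting problem and SQL injection. A sketch of the body of the loop, assuming the same $SqlConnection, $tableName and $StringWriter as above; only the table name still needs string formatting, since it cannot be parameterized:)

$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.Connection = $SqlConnection
$SqlCmd.CommandText = [string]::Format('INSERT INTO {0} VALUES (@xml)', $tableName)
# The parameter carries the record verbatim; no manual quoting needed
[void]$SqlCmd.Parameters.AddWithValue('@xml', $StringWriter.ToString())
[void]$SqlCmd.ExecuteNonQuery()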
The XML document has structure:
<xml>
<root>
...
</root>
<root>
...
</root>
</xml>
and the resultant strings are of form:
<root>
...
</root>
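For reference, here is a minimal sketch of what I understand the streaming alternative to look like: instead of building the whole DOM with [xml], a System.Xml.XmlReader walks the file and yields one <root> element at a time as a string, so only a single record is ever in memory. Read-XmlRecords is a made-up helper name; ReadOuterXml already advances the reader past the element, so the loop must not call Read() again in that branch:

function Read-XmlRecords {
    param([string]$Path)
    # Stream the document instead of loading it into a DOM
    $reader = [System.Xml.XmlReader]::Create($Path)
    try {
        while (-not $reader.EOF) {
            if ($reader.NodeType -eq [System.Xml.XmlNodeType]::Element -and
                $reader.Name -eq 'root') {
                # Emit <root>...</root> as a string; ReadOuterXml advances
                # the reader past the element, so only one record is held
                # in memory at a time
                $reader.ReadOuterXml()
            }
            else {
                [void]$reader.Read()
            }
        }
    }
    finally {
        $reader.Dispose()
    }
}

# Usage, with $xmlpath as above; each emitted string can be fed
# straight to the INSERT instead of first collecting $nodes:
# Read-XmlRecords $xmlpath | ForEach-Object { <# INSERT $_ into the table #> }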