I have a PowerShell script that reads data from a local SQL Server Express 2014 database and writes it to a CSV file. It works fine, but it consumes almost the entire pool of memory available to SQL Server and doesn't release it when the script finishes running. Even calling `[System.GC]::Collect()` doesn't release the memory; I have to restart the SQL Server service to get it back.
Here's the relevant portion of the script, where I've determined the leak is occurring:
try {
    $sqlConn = New-Object -TypeName System.Data.SqlClient.SqlConnection($sqlConnString);
    $sqlConn.Open();
    $sqlCmd = New-Object -TypeName System.Data.SqlClient.SqlCommand;
    $sqlCmd.Connection = $sqlConn;
    $sqlCmd.CommandType = [System.Data.CommandType]::Text;
    $sqlCmd.CommandText = "SELECT * FROM $exportReadings";

    # Write contents to a CSV file
    $line = New-Object -TypeName System.Text.StringBuilder;
    $out = New-Object -TypeName System.IO.StreamWriter -ArgumentList $csvFile, $false;
    $out.WriteLine("HEADER_ROW");

    $reader = $sqlCmd.ExecuteReader();
    $cols = $reader.VisibleFieldCount;
    while ($reader.Read()) {
        # [void] suppresses the StringBuilder return values so they
        # don't leak into the pipeline
        [void]$line.Clear();
        for ($i = 0; $i -lt $cols; $i++) {
            $val = $reader.GetValue($i);
            # Quote any value that contains the delimiter
            if ($val -like "*$delimiter*") {
                [void]$line.Append('"').Append($val).Append('"');
            }
            else {
                [void]$line.Append($val);
            }
            if ($i -ne ($cols - 1)) {
                [void]$line.Append($delimiter);
            }
        }
        $out.WriteLine($line.ToString());
    }
}
catch {
    throw;
}
finally {
    if ($reader) { $reader.Dispose(); }
    if ($out) {
        $out.Close();
        $out.Dispose();
    }
    if ($sqlCmd) { $sqlCmd.Dispose(); }
    if ($sqlConn) { $sqlConn.Dispose(); }
}
If I query memory usage on SQL Server before running the script (using `SELECT * FROM sys.dm_exec_query_resource_semaphores`), I get an `available_memory_kb` of about 750K. Afterwards, it's around 150K, even once the script has terminated.
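For completeness, here's a minimal sketch of how I take that before/after reading from PowerShell (it reuses the same `$sqlConnString` as the export script; the formatting is just for my own logging):

```powershell
# Sketch of the memory check run before and after the export.
# Assumes $sqlConnString is the same connection string the export uses.
$conn = New-Object -TypeName System.Data.SqlClient.SqlConnection($sqlConnString)
$conn.Open()
try {
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = "SELECT pool_id, available_memory_kb FROM sys.dm_exec_query_resource_semaphores"
    $rdr = $cmd.ExecuteReader()
    # One row per resource semaphore per pool
    while ($rdr.Read()) {
        Write-Host ("pool {0}: available_memory_kb = {1}" -f $rdr.GetInt32(0), $rdr.GetInt64(1))
    }
    $rdr.Dispose()
    $cmd.Dispose()
}
finally {
    $conn.Dispose()
}
```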
I've tried to release every resource I can, but I must be missing something. Any ideas?