21

For the following code:

$sql = "select ..... for xml path('row')" # returns a long xml file
$x = Invoke-SqlCmd -Server xxxx $sql
$r = $x[0].ItemArray[0]

The returned $r contains a truncated XML string. How can I make sure the full string is returned?

1
  • Consider whether you really need to use the PowerShell SQL Server module's Invoke-Sqlcmd. I believe it is merely a wrapper around the SQLCMD.exe utility. I sometimes prefer SQLCMD's simpler output over the PowerShell objects. It just depends on what you are trying to do.

2 Answers

34

That cmdlet truncates XML and char data types at a default maximum of 4,000 characters; see the -MaxCharLength parameter in the documentation.

Try the following:

$x = Invoke-SqlCmd -ServerInstance xxxx -Query $sql -MaxCharLength <some large enough number>
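
A minimal sketch of that call, assuming a placeholder server name and an illustrative limit of 2,000,000 characters (neither value comes from the original post):

# 'MyServer' and 2000000 are illustrative placeholders only
$x = Invoke-SqlCmd -ServerInstance MyServer -Query $sql -MaxCharLength 2000000
$r = $x[0].ItemArray[0]
$r.Length   # should now report the full length rather than stopping at 4,000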

7 Comments

Actually, I found that the result is broken into multiple rows.
@dc7a9163d9 I found that to be true as well in some quick tests. However, it should be easy enough to rebuild your XML document; just put Humpty Dumpty together again. ;) (A sketch follows these comments.) That said, I also found that if your query returns more than 4,000 characters of char data for a single column (1,024 bytes for binary; use -MaxBinaryLength), you will need the command above to make sure you get the whole document.
Admittedly, though, my tests were only with binary data I already had in a database somewhere. I didn't test XML or other char data, so I don't know how PowerShell would chunk them; for me it looked like the data was being chopped up by bytes. Perhaps for an XML document it is done per line. You would have to test it.
This answer solved the problem for me, without the data being split into multiple rows. I successfully read a string of over 80,000 bytes.
@CrazyPyro This is not a limitation or issue with PowerShell itself; it is up to the SQL Server team and the module they wrote. This is no different from having a similar issue in Python or other languages that support third-party modules.
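
A minimal sketch of the reassembly idea from the comments above, assuming the long value comes back split across consecutive rows with the pieces in the first column (the column position is an assumption, not confirmed by the original post):

# Collect one chunk per returned row and concatenate them back together
$parts = $x | ForEach-Object { $_.ItemArray[0] }
$fullXml = -join $parts
# Optionally parse the result to confirm the document is intact
[xml]$doc = $fullXml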
4

In many cases this solves the problem:

$x = Invoke-SqlCmd -ServerInstance xxxx -Query $sql -MaxCharLength <some large enough number>

However, there are some cases where it doesn't work:

  • Somewhere between 80,000 and 3,500,000 characters this solution appears to break down.
  • The result I got was scrambled: the inner XML broke the outer XML, so at least the version we use clearly has some defects as well.

You could try a couple of solutions:

  • Limit the content to a fixed number of characters, such as 80,000, and don't try to export anything longer than that. I didn't test whether this also avoids the defect case, so if someone else hits this problem, please comment on whether it helps.
  • I exported everything as CSV, broke out the inner XML, created a temporary XML result, and finally put the inner XML back. This solution worked. The -Raw option when reading files was necessary to get acceptable performance with files of almost one GB (see the sketch after this list).
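
A minimal sketch of the -Raw point in the second bullet, with a placeholder file path; Get-Content -Raw reads the whole file as a single string instead of an array of lines, which is much faster for very large files:

# 'C:\temp\export.csv' is a placeholder path, not from the original answer
$text = Get-Content -Path 'C:\temp\export.csv' -Raw
$text.Length   # length of the entire file, read in one pass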

1 Comment

I had a 4,000-character limit and couldn't fix it by just picking an arbitrary number; changing -MaxCharLength to 80,000 resolved my limit issue. Thanks!
