
I am experiencing an issue with a PowerShell script that doesn't work, whereas a .NET application running the same commands does. I believe the problem is encoding-related: I think PowerShell applies a different encoding, which causes non-obvious problems with a byte-order mark (BOM) or with the decoding/encoding itself.

A little background: I am using ssh's ProxyCommand to automate some steps of setting up a tunnel. I have a working prototype that uses a small .NET-based executable to forward stdin and stdout to a TCP socket. However, I'd preferably like to use a plain PowerShell script if possible.

The data forwarded over stdin/stdout should be preserved as-is, without any encoding conversion. With the .NET application this fully works, but with the PowerShell script containing equivalent code, ssh reports that packet sizes are sometimes different or unexpectedly large.

Does anybody have an idea what call I could make so that PowerShell does not encode stdin/stdout but instead maps the data 1:1?

    Host XXYYZZWW
        User server
        HostName 127.0.0.1
        Port 2222
        StrictHostKeyChecking no
        NumberOfPasswordPrompts 0
        PasswordAuthentication no
        BatchMode yes
        IdentityFile ~/.ssh/id_rsa
        ProxyCommand powershell -File D:/Workspaces/liquid/maintenancw/rustdesk-ssh.ps1 -LocalPort 2222 -RemotePort 22 -RemoteHost 127.0.0.1 -RustdeskId XXYYZZWW
param ($RustdeskId, $IdentityFile, $User, $LocalPort, $RemotePort, $RemoteHost, $RemoteUser)

# Embedded C# that bridges this process's raw stdin/stdout to a TCP socket.
# Note: the host and port are hard-coded below rather than taken from the params above.
$def = @"
using System.Net;
using System.IO;
using System.Diagnostics;
using System.Threading;
using System;

public class MyClass {
  public void Run() {
    var tcpClient = new System.Net.Sockets.TcpClient("localhost", 2222);
    tcpClient.NoDelay = true;
    var tcpStream = tcpClient.GetStream();
    var stdin = System.Console.OpenStandardInput();
    var stdout = System.Console.OpenStandardOutput();
    // Pump both directions concurrently; returns when both copy operations complete.
    System.Threading.Tasks.Task.WaitAll(stdin.CopyToAsync(tcpStream, 1024), tcpStream.CopyToAsync(stdout, 1024));
  }
}
"@

Add-Type -TypeDefinition $def
$obj = New-Object MyClass
$obj.Run()
  • PS is written on the .NET library, so I do not think it is an encoding error. You can save the results to a file and then compare the original data and the final data with Beyond Compare to see the differences. Most likely the issue is waiting for all the data to be received. To send binary data reliably, you need to add a byte count at the beginning of each message and then read until all the bytes are received. Commented Jul 14, 2024 at 13:25
  • Unfortunately this is not practical advice; I'm not the author of the (binary) protocol in question, and I've already experimented to the extent of resetting the send- and receive-buffer settings so that bytes are forwarded as they come in, without delay. P.S. PowerShell has this lovely @"" here-string feature which allows you to embed C# code directly, so I've even been able to transplant my working C# code into PowerShell without modifications and watch it not work. Commented Jul 29, 2024 at 6:34
  • Use single quotes instead of double quotes. You have double quotes inside the string, which is probably causing the error. Commented Jul 29, 2024 at 8:58

1 Answer


What you're trying to do can only be implemented in a PowerShell script if:

  • Either:

    • If you provide stdin input to it from outside PowerShell, such as by calling it from cmd.exe (on Windows) or bash (in Unix-like environments) via the PowerShell CLI (powershell.exe for Windows PowerShell, pwsh for PowerShell (Core) 7)

    • From inside PowerShell, in Windows PowerShell and up to PowerShell 7.3.x, you won't be able to supply raw byte data to your invoked-via-a-child-process script (see below for PowerShell 7.4+).

      • However, if you don't mind the inefficiency, you can use nested child processes, by delegating to the platform-native shell to take advantage of its raw byte handling; e.g., via cmd.exe, assuming that some.exe provides the input to your script:
        cmd /c 'some.exe | powershell -file yourscript.ps1'
  • Or: In PowerShell 7.4+ only, if you:

    • (a) call your script via a child process, using pwsh

      • Note: In Unix-like environments you can ensure this implicitly if you make your script an executable shell script based on a shebang line, and give it a name without the .ps1 extension.
    • and (b) the command providing the input data is either an external (native) program or a PowerShell command that provides the data in the form of a [byte[]] array (or a stream of [byte] instances).

See below for background information.


Support for binary data (raw bytes) in the PowerShell pipeline and via the PowerShell CLI's stdin and stdout streams:

  • Via PowerShell's CLI (powershell.exe for Windows PowerShell, pwsh for PowerShell (Core) 7), i.e. when PowerShell is called from the outside (even in PowerShell 7.4.x):

    • Stdin input to the CLI, as reflected in the automatic $input variable, is invariably interpreted as text.

      • On Windows, it is decoded into .NET strings based on the active code page of the current console / terminal, as reflected in the output from chcp.com.
      • In Unix-like environments, it is decoded as UTF-8.
    • Stdout output from the CLI is invariably converted to text, using plain-text output by default; via -OutputFormat xml (-of xml), you can request CLIXML output, an XML-based serialization format specific to PowerShell that preserves the original data types within limitations.

      • On Windows, the .NET strings comprising the output are encoded based on the encoding stored in [Console]::OutputEncoding, which defaults to the active code page of the current console / terminal, as reflected in the output from chcp.com (but can be changed in-session).
      • In Unix-like environments, it is encoded as UTF-8.
    • However, you can work with binary data (raw bytes), if you use .NET APIs directly, as shown in your question ([System.Console]::OpenStandardInput() and [System.Console]::OpenStandardOutput()).
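
      As a minimal sketch (assuming the script is invoked from outside PowerShell, e.g. via ssh's ProxyCommand, and that the caller supplies the binary data on stdin), such a raw byte relay could look like this:

        $stdin  = [Console]::OpenStandardInput()
        $stdout = [Console]::OpenStandardOutput()
        $buffer = New-Object byte[] 65536
        # Copy raw bytes from stdin to stdout until EOF; no text (de)coding occurs.
        while (($read = $stdin.Read($buffer, 0, $buffer.Length)) -gt 0) {
            $stdout.Write($buffer, 0, $read)
            $stdout.Flush()
        }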

  • Inside a PowerShell session:

    • Windows PowerShell and PowerShell (Core) 7 up to version 7.3.x invariably only "speak text" when sending data to an external (native) program's stdin stream and when receiving data from its stdout stream.

      • Data sent (piped) to an external program is encoded as text, based on the $OutputEncoding preference variable.

      • Data received is decoded based on the encoding stored in [Console]::OutputEncoding.

      • Ensuring that raw bytes are passed between external programs therefore requires a workaround, which on Windows means calling via cmd.exe /c: see this answer.
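
        For instance (a sketch; some.exe and other.exe are hypothetical native programs standing in for the real producer and consumer):

          # From inside Windows PowerShell: delegate the raw byte piping to cmd.exe,
          # so the data never passes through PowerShell's text-based pipeline.
          cmd /c 'some.exe | other.exe'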

    • In PowerShell 7, from version 7.4 onward, raw byte streams are now supported, with limitations:

      • You may send (pipe) [byte[]] data to an external program, which causes the latter to receive the raw bytes contained in this byte array (you may also pipe the data byte by byte ([byte]), but that performs worse).

      • You may receive raw byte data from an external program if you redirect it to a file using >, i.e. via a redirection - there's no direct way to receive the data as a [byte[]] array in memory.

      • Additionally, the pipeline now implicitly acts as a raw byte conduit between external programs. That is, if data is piped from one external program to another, the latter receives the raw bytes.
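
        The 7.4+ behaviors above can be sketched as follows (some.exe and other.exe are hypothetical native programs):

          [byte[]] $bytes = 0..255   # all 256 byte values, as a [byte[]] array
          $bytes | some.exe          # 7.4+: some.exe receives the raw bytes on stdin
          some.exe > out.bin         # 7.4+: raw bytes are redirected to the file as-is
          some.exe | other.exe       # 7.4+: the pipeline acts as a raw byte conduit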


6 Comments

In my scenario, the PowerShell executable is called by ssh's ProxyCommand with redirected stdin/stdout. Basically, ssh generates a binary network protocol and pipes it through the stdin/stdout of the command designated by ProxyCommand. I believe that should qualify under the first rule, i.e. stdin/stdout is provided from an outside process, or did I misunderstand you?
Yes, the CLI rules then apply to you, @LawrenceKok - I've updated the bottom section to make that clearer.
If I understand correctly, by using the .NET APIs directly you bypass the traditional encoding settings. I've done this very deliberately and, as you also suggest, by using the .NET APIs directly you can work with binary data. Do you have an explanation of how my PowerShell and C# scripts could functionally differ? My C# solution invokes the same APIs and works perfectly, yet the PowerShell script falls apart during the key-exchange protocol.
@LawrenceKok, what you're saying sounds correct. The only explanation I can think of - but it sounds like that's not actually the case - is if you provided the binary input data from inside a PowerShell session in WinPS or PowerShell 7.3-, or if you invoked your .ps1 script directly from a PowerShell session rather than via a child process.
Yes, you're correct: the binary input data is provided by ssh.exe, which pipes the stdin and stdout of the PowerShell executable. The stdin/stdout of the process are then used as an alternative to socket send/receive buffers. I even ran the exact same C# code via PowerShell's @"" here-string feature, without any modification (and, I assume, without any PowerShell-specific modification). The only differences I have been able to deduce are that the C# executable is based on .NET 6.0, whereas PowerShell I think runs on the standard .NET Framework, and the output-encoding classes seem different.
