
I need to access an HLSL shader as an ID3DBlob, and I am unsure how to do this in Unity.

The example I am working from

I am working with a native API in Unity, which I have imported from a .dll using the technique described here, per this answer. The API's examples (which you can download from here) include the following C++ call:

varjo_MRD3D11ConfigureShader(m_session, m_d3d11Device.Get(), varjo_ShaderType_VideoPostProcess, &shaderConfig,
            (shaderBlob ? static_cast<char*>(shaderBlob->GetBufferPointer()) : nullptr),
            shaderBlob ? static_cast<int32_t>(shaderBlob->GetBufferSize()) : 0);

This is from the experimental video post-processing API, with some documentation here (you have to download the "Varjo Experimental SDK for Custom Engines" and then navigate to "/docs/_varjo__mr__experimental_8h.html#a01df1512b747fe2e8c75007ad5998d93" for the proper documentation). A screenshot of the full documentation follows:

(screenshot of the varjo_MRD3D11ConfigureShader documentation)

I have determined that the shaderData parameter of type const char * is what contains the shader data being sent to the headset. The example code in the beginning of this question (taken from the linked developer-provided examples) passes in the data from the variable shaderBlob, which is of type ID3DBlob and is documented as part of DirectX here.

In the developer-provided C++ example, the HLSL file is read in and compiled into this ID3DBlob structure by another DirectX function, D3DCompile (documented here):

LOG_INFO("Loading shader source: HLSL Source: %s", shaderFilename.c_str());
auto shaderData = loadFile(shaderFilename);
if (!shaderData.empty()) {
    ComPtr<ID3DBlob> errorMsgs;
    
    // The function of interest - comment my own...
    HRESULT hr = D3DCompile(shaderData.data(), shaderData.size(), shaderFilename.c_str(), nullptr, nullptr, c_shaderEntrypoint, c_shaderTarget, 0,
        0, &shaderBlob, &errorMsgs);
    // --------------------------------------------

    if (FAILED(hr)) {
        if (errorMsgs) {
            std::string err(reinterpret_cast<char*>(errorMsgs->GetBufferPointer()), errorMsgs->GetBufferSize());
            LOG_ERROR("Compiling post process shader failed: %s", err.c_str());
        }
        return false;
    }
} else {
    return false;
}
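The loadFile helper the snippet calls is not shown in the example; presumably it just reads the whole file into a byte vector. A minimal sketch of such a helper (my own reconstruction, not Varjo's actual code) could look like:

```cpp
#include <fstream>
#include <string>
#include <vector>

// Hypothetical reimplementation of the loadFile helper referenced in the
// Varjo example: read the entire file into a byte vector, returning an
// empty vector on any error (which the caller treats as failure).
std::vector<char> loadFile(const std::string& path) {
    std::ifstream in(path, std::ios::binary | std::ios::ate);
    if (!in) return {};
    std::streamsize size = in.tellg();
    in.seekg(0, std::ios::beg);
    std::vector<char> data(static_cast<size_t>(size));
    if (!in.read(data.data(), size)) return {};
    return data;
}
```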

Microsoft also provides some additional explanation of compiling shaders using DirectX here.

The developer-provided examples all come from a stand-alone C++ app.

What I am trying to do

Using the Unity documentation, the DllImport attribute documentation, and the passing-structs tutorial, I have set up the API call in my C# script as follows:

public class NativeSDKWrapper : MonoBehaviour
{
    #region Constants for DLL
    // ...

    /// <summary>
    /// See: http://[your localhost here]/_varjo__types__mr__experimental_8h.html
    /// </summary>
    private const Int64 VARJO_SHADER_TYPE_VIDEO_POST_PROCESS = 1;

    // ...
    #endregion

    #region Varjo Functions from DLL
    // ...

    [DllImport("VarjoLib")]
    private static extern void varjo_MRD3D11ConfigureShader(
        IntPtr session,
        IntPtr device,
        Int64 shaderType,
        ref VarjoShaderConfig shaderConfig, // I have defined the appropriate structs for this
        [MarshalAs(UnmanagedType.LPStr)]string data, // See: https://stackoverflow.com/questions/39987435/pass-char-to-c-dll-from-c-sharp-string
        int shaderSize);

    // ...
    #endregion

    public void LoadShaderFromFile(ShaderParams shaderParams)
    {
        // Shader asset accessed here, and the Varjo lock acquired,
        //     and then shader config set up.
        // ...
        
        // Send the shader to the Varjo headset.
        varjo_MRD3D11ConfigureShader(
            this._session,
            IntPtr.Zero, // Placeholder -- still need to figure out getting the device.
            NativeSDKWrapper.VARJO_SHADER_TYPE_VIDEO_POST_PROCESS,
            ref shaderConfig,
            null, // WHAT GOES HERE??? (The string translated from the ID3DBlob.)
            0);   // The blob's (string's?) size will go here.

        // Varjo unlock here...
        // ...
    }
}

How do I get from a shader asset in my Unity Assets folder to the input string needed for that function call (i.e., the [MarshalAs(UnmanagedType.LPStr)] string data parameter)? I went with a string to match the const char * shown in the documentation screenshot above. Again, in the developer-provided C++ example this input comes from an ID3DBlob.

Does Unity expose an equivalent to DirectX shader compiling through D3DCompile? Does that question even make sense? Or does Unity handle that compiling for me, so that I can just grab the needed data in the right format from somewhere and pass it into this native API call?

I have also read through some of the Unity DirectX documentation but have not found it helpful.
