
d3d11: initial 16 bit format support

Currently, on the Unity side, the texture format is chosen in the C# Unity script by code such as:

tex = Texture2D.CreateExternalTexture((int)i_videoWidth,
                                      (int)i_videoHeight,
                                      TextureFormat.RGBA32,
                                      false, // no mipmaps
                                      true,  // linear sampling
                                      texptr);

According to https://docs.unity3d.com/6000.1/Documentation/ScriptReference/TextureFormat.html, the closest match to DXGI_FORMAT_R16G16B16A16_UNORM would be https://docs.unity3d.com/6000.1/Documentation/ScriptReference/TextureFormat.RGBAHalf.html.
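
For illustration, the user-side change would then be limited to the format argument; a hypothetical sketch follows (note that RGBAHalf is a half-float format rather than a UNORM one, so the mapping is only approximate):

tex = Texture2D.CreateExternalTexture((int)i_videoWidth,
                                      (int)i_videoHeight,
                                      TextureFormat.RGBAHalf, // instead of TextureFormat.RGBA32
                                      false, // no mipmaps
                                      true,  // linear sampling
                                      texptr);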

Since the user creates the texture in their own code, with our current plugin architecture there is no straightforward way for us to know which format they picked when creating it.

We could provide a helper in VLC Unity that would configure the native plugin under the hood, as sketched below.
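
As a rough sketch (the helper name, the ConfigureOutputFormat entry point and the plugin library name are assumptions, not existing VLC Unity API), such a helper could both create the external texture and tell the native plugin which output format to allocate:

using System;
using System.Runtime.InteropServices;
using UnityEngine;

public static class VLCTexturingHelper
{
    public static Texture2D CreateExternalVideoTexture(int width, int height,
                                                       TextureFormat format,
                                                       IntPtr nativeTexture)
    {
        // Tell the native plugin which output format to allocate, based on the
        // Unity format the caller picked (hypothetical native entry point).
        ConfigureOutputFormat((int)format);

        return Texture2D.CreateExternalTexture(width, height, format,
                                               false /* no mipmaps */,
                                               true  /* linear sampling */,
                                               nativeTexture);
    }

    // Hypothetical P/Invoke into the native plugin; the library name is assumed.
    [DllImport("VLCUnityPlugin")]
    private static extern void ConfigureOutputFormat(int unityTextureFormat);
}

This keeps the format decision in a single place, so the Unity texture and the native D3D11 texture cannot silently disagree.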
