I have searched around but can't find a solution to play H.265 files with the hvc1 tag in VLC. The only advice I found so far was to switch to another player.
A friend of mine recorded some videos at a concert with his Samsung S24 and I wanted to watch them in VLC. I updated the K-Lite Codec Pack to make sure the codecs are up to date, but VLC only shows a black screen and plays the audio.
Is there a way to get these files to play in VLC?
d3d11va debug: Trying to use 'HEVC Main profile' as input
d3d11va debug: NV12 output is supported for decoder HEVC Main profile.
d3d11va error: No decoder configuration possible for NV12 7680x4352
d3d11va debug: NV12 output is supported for decoder HEVC Main profile.
d3d11va error: No decoder configuration possible for NV12 7680x4352
d3d11va debug: I420_OPAQUE output is supported for decoder HEVC Main profile.
d3d11va error: No decoder configuration possible for I420_OPAQUE 7680x4352
d3d11va debug: Output format from picture source not supported.
d3d11va warning: Unsupported bitdepth 8 for HEVC Main 10 profile
The file works fine for me on my machine in 3.0 and 4.0.
d3d11va generic debug: Trying to use 'HEVC Main profile (Intel)' as input
d3d11va generic debug: format NV12 is supported for output
d3d11va generic debug: format I420_OPAQUE is supported for output
d3d11va generic debug: favor decoder format VA_NV12
d3d11va generic debug: NV12 output is supported for decoder HEVC Main profile (Intel).
d3d11va generic debug: Using output format NV12 for decoder HEVC Main profile (Intel)
d3d11va generic debug: va_pool_SetupDecoder id 173 7680x4320 count: 28
d3d11va generic debug: ID3D11VideoDecoderOutputView succeed with 28 surfaces (7680x4352)
d3d11va generic debug: we got 6 decoder configurations
d3d11va generic debug: configuration[0] ConfigBitstreamRaw 1
d3d11va generic debug: configuration[1] ConfigBitstreamRaw 2
d3d11va generic debug: configuration[2] ConfigBitstreamRaw 1
d3d11va generic debug: configuration[3] ConfigBitstreamRaw 1
d3d11va generic debug: configuration[4] ConfigBitstreamRaw 2
d3d11va generic debug: configuration[5] ConfigBitstreamRaw 2
One major difference (apart from the NVIDIA brand) is that it uses far fewer surfaces (28 vs 48), which means a lot less memory. You may try lowering the number of threads to 1 to see if it helps (avcodec-threads=1).
Actually it's very likely about the resolution. There is no memory involved when checking the decoder capabilities (although the driver may report that it doesn't support the decoding if it knows it doesn't have enough memory to do so):
D3D11_VIDEO_DECODER_DESC decoderDesc;
ZeroMemory(&decoderDesc, sizeof(decoderDesc));
decoderDesc.Guid = *input;
decoderDesc.SampleWidth = surface_width;
decoderDesc.SampleHeight = surface_height;
decoderDesc.OutputFormat = processorInput[idx];

UINT cfg_count = 0;
hr = ID3D11VideoDevice_GetVideoDecoderConfigCount( dx_sys->d3ddec, &decoderDesc, &cfg_count );
if (FAILED(hr)) {
    msg_Err( va, "Failed to get configuration for decoder %s. (hr=0x%lX)", psz_decoder_name, hr );
    continue;
}
if (cfg_count == 0) {
    msg_Err( va, "No decoder configuration possible for %s %dx%d",
             DxgiFormatToStr(decoderDesc.OutputFormat),
             decoderDesc.SampleWidth, decoderDesc.SampleHeight );
    continue;
}
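For anyone who wants to reproduce the check outside of VLC, here is a minimal standalone sketch (not VLC code; the GUID constant, the 7680x4352 dimensions, the variable names and the error handling are mine) that asks the driver the same question via ID3D11VideoDevice_GetVideoDecoderConfigCount:

/* Minimal standalone probe: ask the D3D11 driver how many decoder
 * configurations it offers for 8K HEVC Main with NV12 output.
 * Build with a Windows SDK; link with d3d11.lib and dxguid.lib. */
#define COBJMACROS
#include <windows.h>
#include <stdio.h>
#include <d3d11.h>

/* DXVA_ModeHEVC_VLD_Main / D3D11_DECODER_PROFILE_HEVC_VLD_MAIN */
static const GUID HEVC_Main_Profile =
    { 0x5b11d51b, 0x2f4c, 0x4452, {0xbc,0xc3,0x09,0xf2,0xa1,0x16,0x0c,0xc0} };

int main(void)
{
    ID3D11Device *dev = NULL;
    HRESULT hr = D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL,
                                   D3D11_CREATE_DEVICE_VIDEO_SUPPORT,
                                   NULL, 0, D3D11_SDK_VERSION,
                                   &dev, NULL, NULL);
    if (FAILED(hr)) { printf("D3D11CreateDevice failed (0x%lX)\n", (unsigned long)hr); return 1; }

    ID3D11VideoDevice *viddev = NULL;
    hr = ID3D11Device_QueryInterface(dev, &IID_ID3D11VideoDevice, (void**)&viddev);
    if (FAILED(hr)) { printf("No ID3D11VideoDevice (0x%lX)\n", (unsigned long)hr); return 1; }

    /* Same combination as in the failing log: 7680x4352 (128-aligned), NV12 output */
    D3D11_VIDEO_DECODER_DESC desc;
    ZeroMemory(&desc, sizeof(desc));
    desc.Guid         = HEVC_Main_Profile;
    desc.SampleWidth  = 7680;
    desc.SampleHeight = 4352;
    desc.OutputFormat = DXGI_FORMAT_NV12;

    UINT cfg_count = 0;
    hr = ID3D11VideoDevice_GetVideoDecoderConfigCount(viddev, &desc, &cfg_count);
    printf("hr=0x%lX, configurations=%u\n", (unsigned long)hr, cfg_count);

    ID3D11VideoDevice_Release(viddev);
    ID3D11Device_Release(dev);
    return 0;
}

On a driver/GPU combination that accepts 8K HEVC Main with NV12 output it should print a non-zero configuration count.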
There's a list of constraints for the NVIDIA decoders (should be the same for DXVA). RTX 2060 seems to be a Turing GPU that should decode HEVC 8192x8192.
You can check the DXVA capabilities of the GPU with DXVAChecker.
Interesting. I suspect it's because it doesn't have enough memory left by the time it needs to allocate internal buffers for decoding. VLC 4 uses a lot less memory than VLC 3 and allocates it in a different order.
Can you check VLC 3 (going back to D3D11VA and the Direct3D11 output) and set the avcodec threads to 1? Advanced options > Input / Codecs > Video Codecs > FFmpeg > Threads (just below the Hardware decoding option).
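The same setting can also be passed on the command line when starting VLC, which is sometimes easier than digging through the preferences (the file name is just a placeholder; --verbose=2 only makes the debug log visible):

vlc --avcodec-threads=1 --verbose=2 your-8k-video.mp4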
I think I found the problem: the screen is black on both VLC 3 and VLC 4 32-bit, while it works fine on both VLC 3 and VLC 4 64-bit. On 32-bit it may be hitting the limit of the virtual address space: 2 GB (or 3.7 GB if VLC is large-address-aware).
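As a rough back-of-the-envelope estimate (my numbers, and it assumes the decoder surfaces or copies of them end up counted against the process address space): one 7680x4352 NV12 surface is about 7680 × 4352 × 1.5 ≈ 50 MB, so 28 surfaces are roughly 1.4 GB and 48 surfaces roughly 2.4 GB, which is already at or beyond what a 2 GB 32-bit process can map before anything else VLC needs.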
Ah, OK. I didn't realize that when you download VLC from the VLC site the default is 32-bit. I didn't notice that there is a drop-down menu where you have to actively choose 64-bit.
After I installed the 64-bit version of VLC everything works fine.
I think the other versions should be more visible on the download page.
I only clicked on Download without seeing that there are other versions of it. ;-)
In our case we test NV12 and I420_OPAQUE with the proper dimensions (with 128 alignment on width and height) and both say there is no available configuration.
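For reference, that 128 alignment is why the logs show 7680x4352 instead of 7680x4320. A small sketch of the rounding (the macro name is mine, not the one used in the VLC source):

#include <stdio.h>

/* Round x up to the next multiple of align (align must be a power of two). */
#define ALIGN_UP(x, align)  (((x) + ((align) - 1)) & ~((align) - 1))

int main(void)
{
    /* 7680 is already a multiple of 128, 4320 rounds up to 4352 */
    printf("%u x %u\n", ALIGN_UP(7680u, 128u), ALIGN_UP(4320u, 128u));
    return 0;
}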
I wish DXVAChecker was open source so we can tell how it's testing for 8K support.
Another option could be that it works in DXVA2 but not D3D11VA.
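That can be tested from the command line by forcing the hardware decoder module via the avcodec-hw option ("none" disables hardware decoding entirely, as a reference point):

vlc --avcodec-hw=dxva2 your-8k-video.mp4
vlc --avcodec-hw=d3d11va your-8k-video.mp4
vlc --avcodec-hw=none your-8k-video.mp4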
d3d11va generic debug: Trying to use 'HEVC Main profile' as input
d3d11va generic debug: format NV12 is supported for output
d3d11va generic debug: format I420_OPAQUE is supported for output
d3d11va generic debug: NV12 output is supported for decoder HEVC Main profile.
d3d11va generic debug: Using output format NV12 for decoder HEVC Main profile
d3d11va generic debug: va_pool_SetupDecoder id 173 7680x4320 count: 28
d3d11va generic warning: not enough decoding slices in the texture (6/28)
d3d11va generic debug: ID3D11VideoDecoderOutputView succeed with 28 surfaces (7680x4352)
d3d11va generic debug: we got 2 decoder configurations
d3d11va generic debug: configuration[0] ConfigBitstreamRaw 1
d3d11va generic debug: configuration[1] ConfigBitstreamRaw 1
d3d11va generic debug: DxCreateDecoderSurfaces succeed
main generic debug: using hw decoder module "d3d11va"
avcodec decoder: Using D3D11VA (NVIDIA GeForce RTX 3070, vendor 10de(NVIDIA), device 2484, revision a1) for hardware decoding
François Cartegnie changed title from "Problem to Play h265 hvc1 Files" to "D3D11VA GetVideoDecoderConfigCount Problem with 8K HEVC output as NV12"
François Cartegnie changed title from "D3D11VA GetVideoDecoderConfigCount Problem with 8K HEVC output as NV12" to "D3D11VA GetVideoDecoderConfigCount Problem with 8K HEVC output as NV12 on 32bits"