Draft: display: get a "decoder device" from the display module
In some cases, such as software decoding, the display module is not provided with a decoder device. In many of those cases it relies on an internal one and operates as if the device had been created with the default user "device parameters" anyway.
It is necessary to share this "decoder device" with filters so they can use the same device to upload data to it (a CPU-to-GPU converter, for example).
This is a proposal for a way to do this, with an implementation for D3D11.
It is one step towards cleaning up the HW & SW output pools discussed in #26825 (closed).
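As a rough illustration of the intent (a hedged sketch, not the actual patch): a CPU-to-GPU converter could retrieve the display's device from the video context it must output to, instead of creating a second device. The idea that `filter_t::vctx_out` arrives pre-filled with the display's video context is an assumption about this proposal's plumbing, and `D3D11UploadInit()` is a hypothetical helper; `vlc_video_context_HoldDevice()` and `vlc_decoder_device_Release()` are the VLC 4 helpers assumed here.

```c
#include <vlc_common.h>
#include <vlc_filter.h>
#include <vlc_codec.h>

static int CpuToGpuOpen(filter_t *filter)
{
    /* the video context the display expects us to output to (assumption:
     * forwarded down the decoder>filter>converter chain by this proposal) */
    vlc_video_context *vctx = filter->vctx_out;
    if (vctx == NULL)
        return VLC_EGENERIC;

    /* reuse the display's "decoder device" rather than creating our own */
    vlc_decoder_device *dev = vlc_video_context_HoldDevice(vctx);
    if (dev == NULL || dev->type != VLC_DECODER_DEVICE_D3D11VA)
    {
        if (dev != NULL)
            vlc_decoder_device_Release(dev);
        return VLC_EGENERIC;
    }

    /* hypothetical helper: allocate upload/staging resources on that device */
    int ret = D3D11UploadInit(filter, dev);
    vlc_decoder_device_Release(dev);
    return ret;
}
```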
* the decoder device is mainly used to manage GPU RAM right now.
* to reduce the confusion, it should be renamed so its usage is not limited to decoding.
* the proposal is to rename "decoder device" to "graphics context".
* in case of SW decoding, the "graphics context" will be created at the first call of `update_format` (see the sketch after this list).
* A "surface allocator" that is able to provide CPU RAM video contexts should also be implemented.
* These changes would also imply creating a new decoder device "type".
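A minimal sketch of the lazy creation mentioned above, assuming a decoder-side `update_format` path; `dec_sys_t` and `CreateGraphicsContext()` are hypothetical names, while `decoder_UpdateVideoOutput()` is the VLC 4 helper assumed to forward the video context:

```c
#include <vlc_common.h>
#include <vlc_codec.h>

static int UpdateFormat(decoder_t *dec)
{
    dec_sys_t *sys = dec->p_sys; /* dec_sys_t is a hypothetical private struct */

    if (sys->graphics_ctx == NULL)
    {
        /* first call: no device exists yet for this SW decoder, so create
         * the "graphics context" with the default user "device parameters" */
        sys->graphics_ctx = CreateGraphicsContext(VLC_OBJECT(dec)); /* hypothetical */
        if (sys->graphics_ctx == NULL)
            return VLC_EGENERIC;
    }

    /* the same context is reused on every subsequent call */
    return decoder_UpdateVideoOutput(dec, sys->graphics_ctx);
}
```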
It doesn't rename the "decoder device". It doesn't split the type from the original type either, although that may be needed if we want to do decoding on one GPU and rendering on another. The creation of the two devices may be split later. As seen in the D3D11 example, creating a decoder device of the proper type already requires forcing a variable on the parent (sketched below). We may also set a different device value than the decoder's.
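For context, a hedged sketch of that constraint; the "dec-dev" variable name, the "d3d11" shortcut, and the exact `vlc_decoder_device_Create()` signature are assumptions about the current code, not verified excerpts:

```c
#include <vlc_common.h>
#include <vlc_codec.h>
#include <vlc_variables.h>

/* force the decoder device type on the parent object before creating the
 * device; "dec-dev" and "d3d11" are assumed names */
static vlc_decoder_device *CreateD3D11Device(vlc_object_t *parent,
                                             vout_window_t *window)
{
    var_Create(parent, "dec-dev", VLC_VAR_STRING);
    var_SetString(parent, "dec-dev", "d3d11");
    return vlc_decoder_device_Create(parent, window);
}
```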
There's also no pool handling through this device.
The device is passed through a video context because we may need to know exactly what kind of video context the last decoder>filter>converter in the chain should send to the display. The device alone is not sufficient to know the details of the format to send (for example the DXGI_FORMAT in D3D11, as sketched below). This is especially true if we want to avoid having one opaque chroma per device+chroma combination (over a hundred in DXGI).
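To illustrate why the device alone is not enough, a hedged sketch of reading the exact output format from the video context; the `d3d11_vctx_priv_t` layout is illustrative and may not match VLC's real D3D11 private structure, while `vlc_video_context_GetPrivate()` and `VLC_VIDEO_CONTEXT_D3D11VA` are the VLC 4 names assumed here:

```c
#include <vlc_common.h>
#include <vlc_picture.h>
#include <dxgi.h>

typedef struct
{
    DXGI_FORMAT format; /* exact texture format the display expects (illustrative layout) */
} d3d11_vctx_priv_t;

static DXGI_FORMAT GetDisplayFormat(vlc_video_context *vctx)
{
    /* a single opaque chroma can map to many DXGI_FORMATs; the video
     * context, not the device, carries which one the display will use */
    d3d11_vctx_priv_t *priv =
        vlc_video_context_GetPrivate(vctx, VLC_VIDEO_CONTEXT_D3D11VA);
    return priv->format;
}
```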