Message ID | 20210426173852.484368-2-harry.wentland@amd.com (mailing list archive)
---|---
State | New, archived
Series | A drm_plane API to support HDR planes
On Mon, Apr 26, 2021 at 01:38:50PM -0400, Harry Wentland wrote:
> From: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
>
> Add the following color encodings
>  - RGB versions for BT601, BT709, BT2020
>  - DCI-P3: Used for digital movies
>
> Signed-off-by: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
> Signed-off-by: Harry Wentland <harry.wentland@amd.com>
> ---
>  drivers/gpu/drm/drm_color_mgmt.c | 4 ++++
>  include/drm/drm_color_mgmt.h | 4 ++++
>  2 files changed, 8 insertions(+)
>
> diff --git a/drivers/gpu/drm/drm_color_mgmt.c b/drivers/gpu/drm/drm_color_mgmt.c
> index bb14f488c8f6..a183ebae2941 100644
> --- a/drivers/gpu/drm/drm_color_mgmt.c
> +++ b/drivers/gpu/drm/drm_color_mgmt.c
> @@ -469,6 +469,10 @@ static const char * const color_encoding_name[] = {
>  [DRM_COLOR_YCBCR_BT601] = "ITU-R BT.601 YCbCr",
>  [DRM_COLOR_YCBCR_BT709] = "ITU-R BT.709 YCbCr",
>  [DRM_COLOR_YCBCR_BT2020] = "ITU-R BT.2020 YCbCr",
> + [DRM_COLOR_RGB_BT601] = "ITU-R BT.601 RGB",
> + [DRM_COLOR_RGB_BT709] = "ITU-R BT.709 RGB",
> + [DRM_COLOR_RGB_BT2020] = "ITU-R BT.2020 RGB",
> + [DRM_COLOR_P3] = "DCI-P3",

These are a totally different thing than the YCbCr stuff. The YCbCr stuff
just specifies the YCbCr<->RGB conversion matrix, whereas these are I guess
supposed to specify the primaries/whitepoint? But without specifying what
we're converting *to* these mean absolutely nothing. I.e. I don't think
they belong in this property.

The previous proposals around this topic have suggested a new property to
specify a conversion matrix either explicitly, or via a separate enum
(which would specify both the src and dst colorspaces). I've always argued
the enum approach is needed anyway since not all hardware has a
programmable matrix for this. But a fully programmable matrix could be
nice for tone mapping purposes/etc, so we may want to make sure both are
possible.

As for the transfer func, the proposals so far have mostly just been to
expose programmable degamma/gamma LUTs for each plane. But considering how
poor the current gamma uapi is, we've thrown around some ideas on how to
allow the kernel to properly expose the hw capabilities. This is one of
those ideas:
https://lists.freedesktop.org/archives/dri-devel/2019-April/212886.html

I think that basic idea could also be extended to allow fixed curves in
case the hw doesn't have a fully programmable LUT. But dunno if that's
relevant for your hw.
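For readers following the thread: what the existing COLOR_ENCODING property selects between is just an encoding matrix, as in the sketch below. This is plain userspace-style C for illustration (not kernel or libdrm code); the coefficients are the standard full-range BT.601 and BT.709 ones, and the helper name is made up for this example.

#include <stdio.h>

/* Full-range YCbCr -> RGB, y in [0,1], cb/cr in [-0.5,0.5].
 * The color_encoding property only picks which of these matrices applies;
 * it says nothing about primaries or white point. */
static void ycbcr_to_rgb(int bt709, float y, float cb, float cr,
                         float *r, float *g, float *b)
{
	if (bt709) {            /* ITU-R BT.709 */
		*r = y + 1.5748f * cr;
		*g = y - 0.1873f * cb - 0.4681f * cr;
		*b = y + 1.8556f * cb;
	} else {                /* ITU-R BT.601 */
		*r = y + 1.4020f * cr;
		*g = y - 0.3441f * cb - 0.7141f * cr;
		*b = y + 1.7720f * cb;
	}
}

int main(void)
{
	float r, g, b;

	/* The same YCbCr triple decodes to different RGB depending on
	 * which matrix the plane is configured with. */
	ycbcr_to_rgb(0, 0.5f, 0.1f, 0.3f, &r, &g, &b);
	printf("BT.601: %.3f %.3f %.3f\n", r, g, b);
	ycbcr_to_rgb(1, 0.5f, 0.1f, 0.3f, &r, &g, &b);
	printf("BT.709: %.3f %.3f %.3f\n", r, g, b);
	return 0;
}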
On 2021-04-26 2:07 p.m., Ville Syrjälä wrote:
> On Mon, Apr 26, 2021 at 01:38:50PM -0400, Harry Wentland wrote:
>> From: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
>>
>> Add the following color encodings
>>  - RGB versions for BT601, BT709, BT2020
>>  - DCI-P3: Used for digital movies
>>
>> Signed-off-by: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
>> Signed-off-by: Harry Wentland <harry.wentland@amd.com>
>> ---
>>  drivers/gpu/drm/drm_color_mgmt.c | 4 ++++
>>  include/drm/drm_color_mgmt.h | 4 ++++
>>  2 files changed, 8 insertions(+)
>>
>> diff --git a/drivers/gpu/drm/drm_color_mgmt.c b/drivers/gpu/drm/drm_color_mgmt.c
>> index bb14f488c8f6..a183ebae2941 100644
>> --- a/drivers/gpu/drm/drm_color_mgmt.c
>> +++ b/drivers/gpu/drm/drm_color_mgmt.c
>> @@ -469,6 +469,10 @@ static const char * const color_encoding_name[] = {
>>  [DRM_COLOR_YCBCR_BT601] = "ITU-R BT.601 YCbCr",
>>  [DRM_COLOR_YCBCR_BT709] = "ITU-R BT.709 YCbCr",
>>  [DRM_COLOR_YCBCR_BT2020] = "ITU-R BT.2020 YCbCr",
>> + [DRM_COLOR_RGB_BT601] = "ITU-R BT.601 RGB",
>> + [DRM_COLOR_RGB_BT709] = "ITU-R BT.709 RGB",
>> + [DRM_COLOR_RGB_BT2020] = "ITU-R BT.2020 RGB",
>> + [DRM_COLOR_P3] = "DCI-P3",
>
> These are a totally different thing than the YCbCr stuff. The YCbCr stuff
> just specifies the YCbCr<->RGB conversion matrix, whereas these are I guess
> supposed to specify the primaries/whitepoint? But without specifying what
> we're converting *to* these mean absolutely nothing. I.e. I don't think
> they belong in this property.

If this is the intention I don't see it documented.

I might have overlooked something, but do we have a way to explicitly
specify today what *to* format the YCbCr color encodings convert into?
Would that be a combination of the output color encoding specified via
colorspace_property and the color space encoded in the primaries and
whitepoint of the hdr_output_metadata?

Fundamentally I don't see how the use of this property differs whether you
translate from YCbCr or from RGB. In either case you're converting from the
defined input color space and pixel format to an output color space and
pixel format.

> The previous proposals around this topic have suggested a new property to
> specify a conversion matrix either explicitly, or via a separate enum
> (which would specify both the src and dst colorspaces). I've always argued
> the enum approach is needed anyway since not all hardware has a
> programmable matrix for this. But a fully programmable matrix could be
> nice for tone mapping purposes/etc, so we may want to make sure both are
> possible.
>
> As for the transfer func, the proposals so far have mostly just been to
> expose programmable degamma/gamma LUTs for each plane. But considering how
> poor the current gamma uapi is, we've thrown around some ideas on how to
> allow the kernel to properly expose the hw capabilities. This is one of
> those ideas:
> https://lists.freedesktop.org/archives/dri-devel/2019-April/212886.html
>
> I think that basic idea could also be extended to allow fixed curves in
> case the hw doesn't have a fully programmable LUT. But dunno if that's
> relevant for your hw.

The problem with exposing gamma, whether per-plane or per-crtc, is that it
is hard to define an API that works for all the HW out there. The
capabilities for different HW differ a lot, not just between vendors but
also between generations of a vendor's HW.

Another reason I'm proposing to define the color space (and gamma) of a
plane is to make this explicit. Up until now, the color space and gamma of
a plane or framebuffer have not been well defined, which leaves drivers
assuming the color space and gamma of a buffer (for blending and other
purposes) and can lead to sub-optimal outcomes.

Harry
On Mon, Apr 26, 2021 at 02:56:26PM -0400, Harry Wentland wrote:
> On 2021-04-26 2:07 p.m., Ville Syrjälä wrote:
> > On Mon, Apr 26, 2021 at 01:38:50PM -0400, Harry Wentland wrote:
> >> From: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
> >>
> >> Add the following color encodings
> >>  - RGB versions for BT601, BT709, BT2020
> >>  - DCI-P3: Used for digital movies
> >>
> >> Signed-off-by: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
> >> Signed-off-by: Harry Wentland <harry.wentland@amd.com>
> >> ---
> >>  drivers/gpu/drm/drm_color_mgmt.c | 4 ++++
> >>  include/drm/drm_color_mgmt.h | 4 ++++
> >>  2 files changed, 8 insertions(+)
> >>
> >> diff --git a/drivers/gpu/drm/drm_color_mgmt.c b/drivers/gpu/drm/drm_color_mgmt.c
> >> index bb14f488c8f6..a183ebae2941 100644
> >> --- a/drivers/gpu/drm/drm_color_mgmt.c
> >> +++ b/drivers/gpu/drm/drm_color_mgmt.c
> >> @@ -469,6 +469,10 @@ static const char * const color_encoding_name[] = {
> >>  [DRM_COLOR_YCBCR_BT601] = "ITU-R BT.601 YCbCr",
> >>  [DRM_COLOR_YCBCR_BT709] = "ITU-R BT.709 YCbCr",
> >>  [DRM_COLOR_YCBCR_BT2020] = "ITU-R BT.2020 YCbCr",
> >> + [DRM_COLOR_RGB_BT601] = "ITU-R BT.601 RGB",
> >> + [DRM_COLOR_RGB_BT709] = "ITU-R BT.709 RGB",
> >> + [DRM_COLOR_RGB_BT2020] = "ITU-R BT.2020 RGB",
> >> + [DRM_COLOR_P3] = "DCI-P3",
> >
> > These are a totally different thing than the YCbCr stuff. The YCbCr
> > stuff just specifies the YCbCr<->RGB conversion matrix, whereas these
> > are I guess supposed to specify the primaries/whitepoint? But without
> > specifying what we're converting *to* these mean absolutely nothing.
> > I.e. I don't think they belong in this property.
>
> If this is the intention I don't see it documented.
>
> I might have overlooked something, but do we have a way to explicitly
> specify today what *to* format the YCbCr color encodings convert into?

These just specify which YCbCr<->RGB matrix to use as specified in the
relevant standards. The primaries/whitepoint/etc. don't change at all.

> Would that be a combination of the output color encoding specified via
> colorspace_property and the color space encoded in the primaries and
> whitepoint of the hdr_output_metadata?

Those properties only affect the infoframes. They don't apply any color
processing to the data.

> Fundamentally I don't see how the use of this property differs whether you
> translate from YCbCr or from RGB. In either case you're converting from
> the defined input color space and pixel format to an output color space
> and pixel format.

The gamut does not change when you do YCbCr<->RGB conversion.

> > The previous proposals around this topic have suggested a new property
> > to specify a conversion matrix either explicitly, or via a separate enum
> > (which would specify both the src and dst colorspaces). I've always
> > argued the enum approach is needed anyway since not all hardware has a
> > programmable matrix for this. But a fully programmable matrix could be
> > nice for tone mapping purposes/etc, so we may want to make sure both are
> > possible.
> >
> > As for the transfer func, the proposals so far have mostly just been to
> > expose programmable degamma/gamma LUTs for each plane. But considering
> > how poor the current gamma uapi is, we've thrown around some ideas on
> > how to allow the kernel to properly expose the hw capabilities. This is
> > one of those ideas:
> > https://lists.freedesktop.org/archives/dri-devel/2019-April/212886.html
> >
> > I think that basic idea could also be extended to allow fixed curves in
> > case the hw doesn't have a fully programmable LUT. But dunno if that's
> > relevant for your hw.
>
> The problem with exposing gamma, whether per-plane or per-crtc, is that it
> is hard to define an API that works for all the HW out there. The
> capabilities for different HW differ a lot, not just between vendors but
> also between generations of a vendor's HW.
>
> Another reason I'm proposing to define the color space (and gamma) of a
> plane is to make this explicit. Up until now, the color space and gamma
> of a plane or framebuffer have not been well defined, which leaves
> drivers assuming the color space and gamma of a buffer (for blending and
> other purposes) and can lead to sub-optimal outcomes.

The current state is that things just get passed through as is (apart from
the crtc LUTs/CTM).
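To make the distinction concrete: a gamut (primaries) conversion is a 3x3 matrix applied to linear-light RGB, entirely separate from the YCbCr encoding matrix above. A minimal sketch using the commonly cited BT.709-to-BT.2020 coefficients (as given in ITU-R BT.2087); the function name is illustrative only.

#include <stdio.h>

/* Convert linear-light BT.709 RGB to linear-light BT.2020 RGB.
 * This is the kind of "what are we converting *to*" information that the
 * color_encoding property does not carry. */
static void rgb709_to_rgb2020(const float in[3], float out[3])
{
	/* Coefficients as given in ITU-R BT.2087. */
	static const float m[3][3] = {
		{ 0.6274f, 0.3293f, 0.0433f },
		{ 0.0691f, 0.9195f, 0.0114f },
		{ 0.0164f, 0.0880f, 0.8956f },
	};

	for (int i = 0; i < 3; i++)
		out[i] = m[i][0] * in[0] + m[i][1] * in[1] + m[i][2] * in[2];
}

int main(void)
{
	float in[3] = { 1.0f, 0.0f, 0.0f };	/* pure BT.709 red */
	float out[3];

	rgb709_to_rgb2020(in, out);
	printf("BT.2020 RGB: %.4f %.4f %.4f\n", out[0], out[1], out[2]);
	return 0;
}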
On Mon, 26 Apr 2021 22:08:55 +0300
Ville Syrjälä <ville.syrjala@linux.intel.com> wrote:

> On Mon, Apr 26, 2021 at 02:56:26PM -0400, Harry Wentland wrote:
> > On 2021-04-26 2:07 p.m., Ville Syrjälä wrote:
> > > On Mon, Apr 26, 2021 at 01:38:50PM -0400, Harry Wentland wrote:
> > >> From: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
> > >>
> > >> Add the following color encodings
> > >>  - RGB versions for BT601, BT709, BT2020
> > >>  - DCI-P3: Used for digital movies
> > >>
> > >> Signed-off-by: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
> > >> Signed-off-by: Harry Wentland <harry.wentland@amd.com>
> > >> ---
> > >>  drivers/gpu/drm/drm_color_mgmt.c | 4 ++++
> > >>  include/drm/drm_color_mgmt.h | 4 ++++
> > >>  2 files changed, 8 insertions(+)
> > >>
> > >> diff --git a/drivers/gpu/drm/drm_color_mgmt.c b/drivers/gpu/drm/drm_color_mgmt.c
> > >> index bb14f488c8f6..a183ebae2941 100644
> > >> --- a/drivers/gpu/drm/drm_color_mgmt.c
> > >> +++ b/drivers/gpu/drm/drm_color_mgmt.c
> > >> @@ -469,6 +469,10 @@ static const char * const color_encoding_name[] = {
> > >>  [DRM_COLOR_YCBCR_BT601] = "ITU-R BT.601 YCbCr",
> > >>  [DRM_COLOR_YCBCR_BT709] = "ITU-R BT.709 YCbCr",
> > >>  [DRM_COLOR_YCBCR_BT2020] = "ITU-R BT.2020 YCbCr",
> > >> + [DRM_COLOR_RGB_BT601] = "ITU-R BT.601 RGB",
> > >> + [DRM_COLOR_RGB_BT709] = "ITU-R BT.709 RGB",
> > >> + [DRM_COLOR_RGB_BT2020] = "ITU-R BT.2020 RGB",
> > >> + [DRM_COLOR_P3] = "DCI-P3",
> > >
> > > These are a totally different thing than the YCbCr stuff. The YCbCr
> > > stuff just specifies the YCbCr<->RGB conversion matrix, whereas these
> > > are I guess supposed to specify the primaries/whitepoint? But without
> > > specifying what we're converting *to* these mean absolutely nothing.
> > > I.e. I don't think they belong in this property.
> >
> > If this is the intention I don't see it documented.
> >
> > I might have overlooked something, but do we have a way to explicitly
> > specify today what *to* format the YCbCr color encodings convert into?
>
> These just specify which YCbCr<->RGB matrix to use as specified in the
> relevant standards. The primaries/whitepoint/etc. don't change at all.

Ville is correct here.

> > Would that be a combination of the output color encoding specified via
> > colorspace_property and the color space encoded in the primaries and
> > whitepoint of the hdr_output_metadata?

Conversion between YCbCr and RGB is not a color space conversion in the
sense of color spaces (chromaticity of primaries and white point). It is
more like a color model conversion or a color encoding conversion.

A benefit of YCbCr is that you can use less bandwidth to transmit the same
image and people won't realise that you lost anything: chroma sub-sampling.
Sub-sampling with RGB wouldn't work that well. It's a lossy compression
technique, but different standards use different compression algorithms
(well, matrices) to balance what gets lost.

> Those properties only affect the infoframes. They don't apply any color
> processing to the data.

Indeed.

An example: You start with YUV video you want to display. That means you
have YCbCr data using color space X and EOTF Foo. When you convert that to
RGB, the RGB data still has color space X and EOTF Foo. Then you use the
infoframe to tell your monitor that the data is in color space X and using
EOTF Foo. At no point in that pipeline is there a color space
transformation, until the data actually reaches the monitor, which may do
magic things to map color space X and EOTF Foo into what it can actually
make as light.

Or, as with the traditional way, you don't care what color space or EOTF
your video uses or your monitor has. You just hope they are close enough to
look good enough that people don't see anything wrong. Close your eyes and
sing a happy song. With HDR and WCG, that totally breaks down.

> > Fundamentally I don't see how the use of this property differs whether
> > you translate from YCbCr or from RGB. In either case you're converting
> > from the defined input color space and pixel format to an output color
> > space and pixel format.
>
> The gamut does not change when you do YCbCr<->RGB conversion.

Right. Neither does dynamic range.

> > > The previous proposals around this topic have suggested a new property
> > > to specify a conversion matrix either explicitly, or via a separate
> > > enum (which would specify both the src and dst colorspaces). I've
> > > always argued the enum approach is needed anyway since not all
> > > hardware has a programmable matrix for this. But a fully programmable
> > > matrix could be nice for tone mapping purposes/etc, so we may want to
> > > make sure both are possible.
> > >
> > > As for the transfer func, the proposals so far have mostly just been
> > > to expose programmable degamma/gamma LUTs for each plane. But
> > > considering how poor the current gamma uapi is, we've thrown around
> > > some ideas on how to allow the kernel to properly expose the hw
> > > capabilities. This is one of those ideas:
> > > https://lists.freedesktop.org/archives/dri-devel/2019-April/212886.html
> > >
> > > I think that basic idea could also be extended to allow fixed curves
> > > in case the hw doesn't have a fully programmable LUT. But dunno if
> > > that's relevant for your hw.
> >
> > The problem with exposing gamma, whether per-plane or per-crtc, is that
> > it is hard to define an API that works for all the HW out there. The
> > capabilities for different HW differ a lot, not just between vendors
> > but also between generations of a vendor's HW.
> >
> > Another reason I'm proposing to define the color space (and gamma) of a
> > plane is to make this explicit. Up until now, the color space and gamma
> > of a plane or framebuffer have not been well defined, which leaves
> > drivers assuming the color space and gamma of a buffer (for blending
> > and other purposes) and can lead to sub-optimal outcomes.
>
> The current state is that things just get passed through as is (apart
> from the crtc LUTs/CTM).

Right.

I would claim that the kernel does not even want to know about color spaces
or EOTFs. Instead, the kernel should offer userspace ways to program the
hardware to do the color *transformations* it wants to do. The
color_encoding KMS property is already like this: it defines the conversion
matrix, not what the input or output are.

Infoframes being sent to displays are a different thing. They just tell the
monitor what kind of image data userspace has configured KMS to send it,
but they do not change what KMS actually does with pixels.

Also, please, let's talk about EOTF and EOTF^-1 instead of gamma when
appropriate. Electro-optical transfer function (EOTF) is very clear in what
it means: it is the mapping from electrical values (the non-linear pixel
values you are used to, good for storage and transmission) to optical
values (values that are linear in light intensity, therefore good for
things like blending and filtering). Gamma is kind of the same, but when
you use it in sentences it easily becomes ambiguous. Like if you have
"gamma corrected pixels", what does that mean? Are they electrical values,
or optical values, or maybe electrical values with a different EOTF? What
EOTF?

However, in KMS "gamma LUT" is kind of standardised terminology, and it
does not need to be an EOTF or inverse-EOTF. One can use a gamma LUT for an
EETF, mapping from one EOTF to another, e.g. from data with content EOTF to
data with monitor EOTF.

Thanks,
pq
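For reference, the EOTF/inverse-EOTF terminology above maps onto curve pairs like the sRGB one below (piecewise curves as in IEC 61966-2-1). This is a small standalone C sketch for illustration; blending or filtering would operate on the values returned by the EOTF.

#include <math.h>
#include <stdio.h>

/* sRGB EOTF: electrical [0,1] -> optical (linear light) [0,1]. */
static float srgb_eotf(float e)
{
	return e <= 0.04045f ? e / 12.92f : powf((e + 0.055f) / 1.055f, 2.4f);
}

/* sRGB inverse EOTF: optical -> electrical. */
static float srgb_inv_eotf(float o)
{
	return o <= 0.0031308f ? o * 12.92f : 1.055f * powf(o, 1.0f / 2.4f) - 0.055f;
}

int main(void)
{
	/* 50% electrical grey is only about 21% linear light. */
	printf("eotf(0.5) = %f\n", srgb_eotf(0.5f));
	printf("inv_eotf(eotf(0.5)) = %f\n", srgb_inv_eotf(srgb_eotf(0.5f)));
	return 0;
}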
On 2021-04-26 20:56, Harry Wentland wrote:
> On 2021-04-26 2:07 p.m., Ville Syrjälä wrote:
>> On Mon, Apr 26, 2021 at 01:38:50PM -0400, Harry Wentland wrote:
>>> From: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
>>>
>>> Add the following color encodings
>>>  - RGB versions for BT601, BT709, BT2020
>>>  - DCI-P3: Used for digital movies
>>>
>>> Signed-off-by: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
>>> Signed-off-by: Harry Wentland <harry.wentland@amd.com>
>>> ---
>>>  drivers/gpu/drm/drm_color_mgmt.c | 4 ++++
>>>  include/drm/drm_color_mgmt.h | 4 ++++
>>>  2 files changed, 8 insertions(+)
>>>
>>> diff --git a/drivers/gpu/drm/drm_color_mgmt.c b/drivers/gpu/drm/drm_color_mgmt.c
>>> index bb14f488c8f6..a183ebae2941 100644
>>> --- a/drivers/gpu/drm/drm_color_mgmt.c
>>> +++ b/drivers/gpu/drm/drm_color_mgmt.c
>>> @@ -469,6 +469,10 @@ static const char * const color_encoding_name[] = {
>>>  [DRM_COLOR_YCBCR_BT601] = "ITU-R BT.601 YCbCr",
>>>  [DRM_COLOR_YCBCR_BT709] = "ITU-R BT.709 YCbCr",
>>>  [DRM_COLOR_YCBCR_BT2020] = "ITU-R BT.2020 YCbCr",
>>> + [DRM_COLOR_RGB_BT601] = "ITU-R BT.601 RGB",
>>> + [DRM_COLOR_RGB_BT709] = "ITU-R BT.709 RGB",
>>> + [DRM_COLOR_RGB_BT2020] = "ITU-R BT.2020 RGB",
>>> + [DRM_COLOR_P3] = "DCI-P3",
>>
>> These are a totally different thing than the YCbCr stuff. The YCbCr stuff
>> just specifies the YCbCr<->RGB conversion matrix, whereas these are I
>> guess supposed to specify the primaries/whitepoint? But without
>> specifying what we're converting *to* these mean absolutely nothing.
>> I.e. I don't think they belong in this property.
>
> If this is the intention I don't see it documented.
>
> I might have overlooked something, but do we have a way to explicitly
> specify today what *to* format the YCbCr color encodings convert into?
> Would that be a combination of the output color encoding specified via
> colorspace_property and the color space encoded in the primaries and
> whitepoint of the hdr_output_metadata?
>
> Fundamentally I don't see how the use of this property differs whether
> you translate from YCbCr or from RGB. In either case you're converting
> from the defined input color space and pixel format to an output color
> space and pixel format.
>
>> The previous proposals around this topic have suggested a new property
>> to specify a conversion matrix either explicitly, or via a separate enum
>> (which would specify both the src and dst colorspaces). I've always
>> argued the enum approach is needed anyway since not all hardware has a
>> programmable matrix for this. But a fully programmable matrix could be
>> nice for tone mapping purposes/etc, so we may want to make sure both are
>> possible.
>>
>> As for the transfer func, the proposals so far have mostly just been to
>> expose programmable degamma/gamma LUTs for each plane. But considering
>> how poor the current gamma uapi is, we've thrown around some ideas on
>> how to allow the kernel to properly expose the hw capabilities. This is
>> one of those ideas:
>> https://lists.freedesktop.org/archives/dri-devel/2019-April/212886.html
>>
>> I think that basic idea could also be extended to allow fixed curves in
>> case the hw doesn't have a fully programmable LUT. But dunno if that's
>> relevant for your hw.
>
> The problem with exposing gamma, whether per-plane or per-crtc, is that
> it is hard to define an API that works for all the HW out there. The
> capabilities for different HW differ a lot, not just between vendors but
> also between generations of a vendor's HW.

Introducing another API if hardware is sufficiently different doesn't seem
like the worst idea. At least it sounds a lot more tractable than teaching
the kernel about all the different use cases, opinions and nuances that
arise from color management.

In the end generic user space must always be able to fall back to software,
so the worst case is that it won't be able to offload an operation if it
doesn't know about a new API.

> Another reason I'm proposing to define the color space (and gamma) of a
> plane is to make this explicit. Up until now, the color space and gamma
> of a plane or framebuffer have not been well defined, which leaves
> drivers assuming the color space and gamma of a buffer (for blending and
> other purposes) and can lead to sub-optimal outcomes.

Blending is only "correct" with linear light, so that property of the color
space is important. However, why does the kernel have to be involved here?
As long as user space knows that for correct blending the data must
represent linear light and knows when in the pipeline blending happens, it
can make sure that the data at that point in the pipeline contains linear
light.

What other purposes are there?

In general I agree with the others that user space only wants a pipeline of
transformations where the mechanism, the order and ideally the precision is
defined.

> Harry
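A small numeric illustration of the linear-light blending point above. The gamma-2.2 power curve here is a simplified stand-in for a real EOTF (not what any particular driver does); the point is only that blending encoded values and blending linear-light values give different results.

#include <math.h>
#include <stdio.h>

/* Simple gamma-2.2 power curve as a stand-in EOTF for illustration;
 * real content would use the actual sRGB/PQ/etc. curve. */
static float eotf(float e)     { return powf(e, 2.2f); }
static float inv_eotf(float o) { return powf(o, 1.0f / 2.2f); }

int main(void)
{
	float a = 0.25f, b = 0.75f;	/* two encoded (non-linear) values */

	/* Blend directly on encoded values: what you get if the pipeline
	 * ignores the transfer function. */
	float wrong = 0.5f * a + 0.5f * b;

	/* Decode to linear light, blend, re-encode. */
	float right = inv_eotf(0.5f * eotf(a) + 0.5f * eotf(b));

	printf("blend on encoded values: %.3f\n", wrong);	/* 0.500 */
	printf("blend in linear light:   %.3f\n", right);	/* ~0.569 */
	return 0;
}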
On 2021-04-30 8:53 p.m., Sebastian Wick wrote:
> On 2021-04-26 20:56, Harry Wentland wrote:
>> On 2021-04-26 2:07 p.m., Ville Syrjälä wrote:
>>> On Mon, Apr 26, 2021 at 01:38:50PM -0400, Harry Wentland wrote:
>>>> From: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
>>>>
>>>> Add the following color encodings
>>>>  - RGB versions for BT601, BT709, BT2020
>>>>  - DCI-P3: Used for digital movies
>>>>
>>>> Signed-off-by: Bhawanpreet Lakha <Bhawanpreet.Lakha@amd.com>
>>>> Signed-off-by: Harry Wentland <harry.wentland@amd.com>
>>>> ---
>>>>  drivers/gpu/drm/drm_color_mgmt.c | 4 ++++
>>>>  include/drm/drm_color_mgmt.h | 4 ++++
>>>>  2 files changed, 8 insertions(+)
>>>>
>>>> diff --git a/drivers/gpu/drm/drm_color_mgmt.c b/drivers/gpu/drm/drm_color_mgmt.c
>>>> index bb14f488c8f6..a183ebae2941 100644
>>>> --- a/drivers/gpu/drm/drm_color_mgmt.c
>>>> +++ b/drivers/gpu/drm/drm_color_mgmt.c
>>>> @@ -469,6 +469,10 @@ static const char * const color_encoding_name[] = {
>>>>  [DRM_COLOR_YCBCR_BT601] = "ITU-R BT.601 YCbCr",
>>>>  [DRM_COLOR_YCBCR_BT709] = "ITU-R BT.709 YCbCr",
>>>>  [DRM_COLOR_YCBCR_BT2020] = "ITU-R BT.2020 YCbCr",
>>>> + [DRM_COLOR_RGB_BT601] = "ITU-R BT.601 RGB",
>>>> + [DRM_COLOR_RGB_BT709] = "ITU-R BT.709 RGB",
>>>> + [DRM_COLOR_RGB_BT2020] = "ITU-R BT.2020 RGB",
>>>> + [DRM_COLOR_P3] = "DCI-P3",
>>>
>>> These are a totally different thing than the YCbCr stuff. The YCbCr
>>> stuff just specifies the YCbCr<->RGB conversion matrix, whereas these
>>> are I guess supposed to specify the primaries/whitepoint? But without
>>> specifying what we're converting *to* these mean absolutely nothing.
>>> I.e. I don't think they belong in this property.
>>
>> If this is the intention I don't see it documented.
>>
>> I might have overlooked something, but do we have a way to explicitly
>> specify today what *to* format the YCbCr color encodings convert into?
>> Would that be a combination of the output color encoding specified via
>> colorspace_property and the color space encoded in the primaries and
>> whitepoint of the hdr_output_metadata?
>>
>> Fundamentally I don't see how the use of this property differs whether
>> you translate from YCbCr or from RGB. In either case you're converting
>> from the defined input color space and pixel format to an output color
>> space and pixel format.
>>
>>> The previous proposals around this topic have suggested a new property
>>> to specify a conversion matrix either explicitly, or via a separate
>>> enum (which would specify both the src and dst colorspaces). I've
>>> always argued the enum approach is needed anyway since not all hardware
>>> has a programmable matrix for this. But a fully programmable matrix
>>> could be nice for tone mapping purposes/etc, so we may want to make
>>> sure both are possible.
>>>
>>> As for the transfer func, the proposals so far have mostly just been to
>>> expose programmable degamma/gamma LUTs for each plane. But considering
>>> how poor the current gamma uapi is, we've thrown around some ideas on
>>> how to allow the kernel to properly expose the hw capabilities. This is
>>> one of those ideas:
>>> https://lists.freedesktop.org/archives/dri-devel/2019-April/212886.html
>>>
>>> I think that basic idea could also be extended to allow fixed curves in
>>> case the hw doesn't have a fully programmable LUT. But dunno if that's
>>> relevant for your hw.
>>
>> The problem with exposing gamma, whether per-plane or per-crtc, is that
>> it is hard to define an API that works for all the HW out there. The
>> capabilities for different HW differ a lot, not just between vendors
>> but also between generations of a vendor's HW.
>
> Introducing another API if hardware is sufficiently different doesn't
> seem like the worst idea. At least it sounds a lot more tractable than
> teaching the kernel about all the different use cases, opinions and
> nuances that arise from color management.
>
> In the end generic user space must always be able to fall back to
> software, so the worst case is that it won't be able to offload an
> operation if it doesn't know about a new API.
>
>> Another reason I'm proposing to define the color space (and gamma) of a
>> plane is to make this explicit. Up until now, the color space and gamma
>> of a plane or framebuffer have not been well defined, which leaves
>> drivers assuming the color space and gamma of a buffer (for blending
>> and other purposes) and can lead to sub-optimal outcomes.
>
> Blending is only "correct" with linear light, so that property of the
> color space is important. However, why does the kernel have to be
> involved here? As long as user space knows that for correct blending the
> data must represent linear light and knows when in the pipeline blending
> happens, it can make sure that the data at that point in the pipeline
> contains linear light.

The only reason the kernel needs to be involved is to take full advantage
of the available HW without requiring KMS clients to be aware of the
difference in display HW.

Harry

> What other purposes are there?
>
> In general I agree with the others that user space only wants a pipeline
> of transformations where the mechanism, the order and ideally the
> precision is defined.
>
>> Harry
> _______________________________________________
> amd-gfx mailing list
> amd-gfx@lists.freedesktop.org
> https://lists.freedesktop.org/mailman/listinfo/amd-gfx
On Fri, 14 May 2021 17:04:51 -0400
Harry Wentland <harry.wentland@amd.com> wrote:

> On 2021-04-30 8:53 p.m., Sebastian Wick wrote:
> > On 2021-04-26 20:56, Harry Wentland wrote:

...

> >> Another reason I'm proposing to define the color space (and gamma) of
> >> a plane is to make this explicit. Up until now, the color space and
> >> gamma of a plane or framebuffer have not been well defined, which
> >> leaves drivers assuming the color space and gamma of a buffer (for
> >> blending and other purposes) and can lead to sub-optimal outcomes.
> >
> > Blending is only "correct" with linear light, so that property of the
> > color space is important. However, why does the kernel have to be
> > involved here? As long as user space knows that for correct blending
> > the data must represent linear light and knows when in the pipeline
> > blending happens, it can make sure that the data at that point in the
> > pipeline contains linear light.
>
> The only reason the kernel needs to be involved is to take full advantage
> of the available HW without requiring KMS clients to be aware of the
> difference in display HW.

Can you explain with more tangible examples why you think so, please?

Is it because hardware does not fit the KMS UAPI model of the abstract
pixel pipeline?

Or is it because you have fixed-function hardware elements that you can
only make use of when userspace uses an enum-based UAPI?

I would totally agree that the driver does not want to be analysing LUT
entries to decipher if it could use a fixed-function element or not. It
would introduce uncertainty in the UAPI. So fixed-function elements would
need their own properties, but I don't know if that is feasible as generic
UAPI or if it should be driver-specific (and so left unused by generic
userspace).

Thanks,
pq
On 2021-05-17 4:34 a.m., Pekka Paalanen wrote:
> On Fri, 14 May 2021 17:04:51 -0400
> Harry Wentland <harry.wentland@amd.com> wrote:
>
>> On 2021-04-30 8:53 p.m., Sebastian Wick wrote:
>>> On 2021-04-26 20:56, Harry Wentland wrote:
>
> ...
>
>>>> Another reason I'm proposing to define the color space (and gamma) of
>>>> a plane is to make this explicit. Up until now, the color space and
>>>> gamma of a plane or framebuffer have not been well defined, which
>>>> leaves drivers assuming the color space and gamma of a buffer (for
>>>> blending and other purposes) and can lead to sub-optimal outcomes.
>>>
>>> Blending is only "correct" with linear light, so that property of the
>>> color space is important. However, why does the kernel have to be
>>> involved here? As long as user space knows that for correct blending
>>> the data must represent linear light and knows when in the pipeline
>>> blending happens, it can make sure that the data at that point in the
>>> pipeline contains linear light.
>>
>> The only reason the kernel needs to be involved is to take full
>> advantage of the available HW without requiring KMS clients to be aware
>> of the difference in display HW.
>
> Can you explain with more tangible examples why you think so, please?
>
> Is it because hardware does not fit the KMS UAPI model of the abstract
> pixel pipeline?

I'd wager no HW is designed to meet KMS UAPI; rather, KMS UAPI is designed
to abstract HW.

> Or is it because you have fixed-function hardware elements that you can
> only make use of when userspace uses an enum-based UAPI?

One example is our degamma on our latest generation HW, where we have
fixed-function "degamma" (rather, de-EOTF):

https://gitlab.freedesktop.org/agd5f/linux/-/blob/amd-staging-drm-next/drivers/gpu/drm/amd/display/dc/dcn30/dcn30_dpp.c#L166

> I would totally agree that the driver does not want to be analysing LUT
> entries to decipher if it could use a fixed-function element or not. It
> would introduce uncertainty in the UAPI. So fixed-function elements
> would need their own properties, but I don't know if that is feasible as
> generic UAPI or if it should be driver-specific (and so left unused by
> generic userspace).

For the CRTC LUTs we actually do a linearity check to program the HW into
bypass when the LUT is linear, since the KMS LUT definition doesn't map
well onto the LUT definition used by our HW, which leads to rounding errors
and failing IGT kms_color tests (if I remember this correctly).

https://gitlab.freedesktop.org/agd5f/linux/-/blob/amd-staging-drm-next/drivers/gpu/drm/amd/display/amdgpu_dm/amdgpu_dm_color.c#L330

Hence the suggestion to define pre-defined TFs right at a KMS level for use
cases where we can assume the display will tonemap the content.

Harry

> Thanks,
> pq
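For illustration, a simplified standalone sketch of the kind of LUT linearity/bypass check described above. It is not the actual amdgpu_dm code; the struct is redeclared locally (mirroring the DRM uAPI struct drm_color_lut layout) and the tolerance value is arbitrary, just to keep the sketch self-contained.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Mirrors the layout of struct drm_color_lut in the DRM uAPI
 * (include/uapi/drm/drm_mode.h); redeclared here for a userspace sketch. */
struct color_lut_entry {
	uint16_t red;
	uint16_t green;
	uint16_t blue;
	uint16_t reserved;
};

/* Return true if the LUT is (within a tolerance) the identity ramp, so a
 * driver could program the corresponding hardware block into bypass. */
static bool lut_is_linear(const struct color_lut_entry *lut,
			  unsigned int size, uint16_t tolerance)
{
	if (size < 2)
		return true;

	for (unsigned int i = 0; i < size; i++) {
		/* Expected value of an identity ramp at entry i. */
		uint32_t expected = (uint32_t)i * 0xffff / (size - 1);

		if (abs((int)lut[i].red - (int)expected) > tolerance ||
		    abs((int)lut[i].green - (int)expected) > tolerance ||
		    abs((int)lut[i].blue - (int)expected) > tolerance)
			return false;
	}
	return true;
}

int main(void)
{
	struct color_lut_entry lut[256];

	for (unsigned int i = 0; i < 256; i++) {
		uint16_t v = (uint16_t)(i * 0xffff / 255);
		lut[i].red = lut[i].green = lut[i].blue = v;
		lut[i].reserved = 0;
	}
	printf("identity ramp detected: %d\n", lut_is_linear(lut, 256, 0x101));
	return 0;
}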
On Tue, 18 May 2021 10:32:48 -0400
Harry Wentland <harry.wentland@amd.com> wrote:

> On 2021-05-17 4:34 a.m., Pekka Paalanen wrote:
> > On Fri, 14 May 2021 17:04:51 -0400
> > Harry Wentland <harry.wentland@amd.com> wrote:
> >
> >> On 2021-04-30 8:53 p.m., Sebastian Wick wrote:
> >>> On 2021-04-26 20:56, Harry Wentland wrote:
> >
> > ...
> >
> >>>> Another reason I'm proposing to define the color space (and gamma)
> >>>> of a plane is to make this explicit. Up until now, the color space
> >>>> and gamma of a plane or framebuffer have not been well defined,
> >>>> which leaves drivers assuming the color space and gamma of a buffer
> >>>> (for blending and other purposes) and can lead to sub-optimal
> >>>> outcomes.
> >>>
> >>> Blending is only "correct" with linear light, so that property of the
> >>> color space is important. However, why does the kernel have to be
> >>> involved here? As long as user space knows that for correct blending
> >>> the data must represent linear light and knows when in the pipeline
> >>> blending happens, it can make sure that the data at that point in the
> >>> pipeline contains linear light.
> >>
> >> The only reason the kernel needs to be involved is to take full
> >> advantage of the available HW without requiring KMS clients to be
> >> aware of the difference in display HW.
> >
> > Can you explain with more tangible examples why you think so, please?
> >
> > Is it because hardware does not fit the KMS UAPI model of the abstract
> > pixel pipeline?
>
> I'd wager no HW is designed to meet KMS UAPI; rather, KMS UAPI is
> designed to abstract HW.

Of course, but you are in big trouble in any case if there is a fundamental
mismatch.

You may have to declare that all existing KMS properties for this stuff
will be mutually exclusive with your new properties, so that you can
introduce a new generic abstract pipeline in KMS. By mutually exclusive I
mean that a driver must advertise only one or the other set of properties
and never both. If you want to support userspace that doesn't understand
the alternative set, maybe you also need a drm client cap to switch to the
alternative set per-drm-client.

> > Or is it because you have fixed-function hardware elements that you can
> > only make use of when userspace uses an enum-based UAPI?
>
> One example is our degamma on our latest generation HW, where we have
> fixed-function "degamma" (rather, de-EOTF):
>
> https://gitlab.freedesktop.org/agd5f/linux/-/blob/amd-staging-drm-next/drivers/gpu/drm/amd/display/dc/dcn30/dcn30_dpp.c#L166

Ok.

> > I would totally agree that the driver does not want to be analysing LUT
> > entries to decipher if it could use a fixed-function element or not. It
> > would introduce uncertainty in the UAPI. So fixed-function elements
> > would need their own properties, but I don't know if that is feasible
> > as generic UAPI or if it should be driver-specific (and so left unused
> > by generic userspace).
>
> For the CRTC LUTs we actually do a linearity check to program the HW into
> bypass when the LUT is linear, since the KMS LUT definition doesn't map
> well onto the LUT definition used by our HW, which leads to rounding
> errors and failing IGT kms_color tests (if I remember this correctly).
>
> https://gitlab.freedesktop.org/agd5f/linux/-/blob/amd-staging-drm-next/drivers/gpu/drm/amd/display/amdgpu_dm/amdgpu_dm_color.c#L330
>
> Hence the suggestion to define pre-defined TFs right at a KMS level for
> use cases where we can assume the display will tonemap the content.

Right.

Explaining this would have been a good introduction in your cover letter.

Maybe you want to define new KMS properties that shall be mutually
exclusive with the existing KMS GAMMA/CTM/DEGAMMA properties and clearly
document them as such.

Thanks,
pq
diff --git a/drivers/gpu/drm/drm_color_mgmt.c b/drivers/gpu/drm/drm_color_mgmt.c
index bb14f488c8f6..a183ebae2941 100644
--- a/drivers/gpu/drm/drm_color_mgmt.c
+++ b/drivers/gpu/drm/drm_color_mgmt.c
@@ -469,6 +469,10 @@ static const char * const color_encoding_name[] = {
 	[DRM_COLOR_YCBCR_BT601] = "ITU-R BT.601 YCbCr",
 	[DRM_COLOR_YCBCR_BT709] = "ITU-R BT.709 YCbCr",
 	[DRM_COLOR_YCBCR_BT2020] = "ITU-R BT.2020 YCbCr",
+	[DRM_COLOR_RGB_BT601] = "ITU-R BT.601 RGB",
+	[DRM_COLOR_RGB_BT709] = "ITU-R BT.709 RGB",
+	[DRM_COLOR_RGB_BT2020] = "ITU-R BT.2020 RGB",
+	[DRM_COLOR_P3] = "DCI-P3",
 };
 
 static const char * const color_range_name[] = {
diff --git a/include/drm/drm_color_mgmt.h b/include/drm/drm_color_mgmt.h
index 81c298488b0c..3043dd73480c 100644
--- a/include/drm/drm_color_mgmt.h
+++ b/include/drm/drm_color_mgmt.h
@@ -78,6 +78,10 @@ enum drm_color_encoding {
 	DRM_COLOR_YCBCR_BT601,
 	DRM_COLOR_YCBCR_BT709,
 	DRM_COLOR_YCBCR_BT2020,
+	DRM_COLOR_RGB_BT601,
+	DRM_COLOR_RGB_BT709,
+	DRM_COLOR_RGB_BT2020,
+	DRM_COLOR_P3,
 	DRM_COLOR_ENCODING_MAX,
 };
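For completeness, a sketch of how userspace could select one of these encodings through the existing per-plane "COLOR_ENCODING" property using libdrm. It assumes the driver actually exposes the requested enum entry (e.g. "ITU-R BT.709 RGB"), which would only be the case if this series and corresponding driver support land; existing drivers only expose the YCbCr entries.

#include <stdint.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Set the plane's COLOR_ENCODING property to the enum entry whose name
 * matches 'name'. Returns 0 on success, -1 if the property or enum value
 * is not exposed by the driver. */
static int set_color_encoding(int fd, uint32_t plane_id, const char *name)
{
	drmModeObjectProperties *props =
		drmModeObjectGetProperties(fd, plane_id, DRM_MODE_OBJECT_PLANE);
	int ret = -1;

	if (!props)
		return -1;

	for (uint32_t i = 0; i < props->count_props && ret != 0; i++) {
		drmModePropertyRes *prop = drmModeGetProperty(fd, props->props[i]);

		if (!prop)
			continue;
		if (!strcmp(prop->name, "COLOR_ENCODING")) {
			for (int j = 0; j < prop->count_enums; j++) {
				if (!strcmp(prop->enums[j].name, name)) {
					ret = drmModeObjectSetProperty(fd, plane_id,
							DRM_MODE_OBJECT_PLANE,
							prop->prop_id,
							prop->enums[j].value);
					break;
				}
			}
		}
		drmModeFreeProperty(prop);
	}
	drmModeFreeObjectProperties(props);
	return ret;
}

Usage would be e.g. set_color_encoding(drm_fd, plane_id, "ITU-R BT.709 RGB") after opening the device; an atomic-modesetting client would typically fold the same property write into its atomic commit instead of using drmModeObjectSetProperty directly.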