I'm not sure I understand the point you're making, Iliah.
Point is, you can't check clipping or banding visually on a display, because the display device's capabilities limit what you can see.
Of course, but the point of my comparison wasn't to show differences in clipping or banding but, rather, to show the global tonal differences brought on by a fairly typical adjustment in PS.
I'm afraid I do not understand. Since display device capabilities set the limits, the way out is to extract the colour list and plot clipping and banding statistics.
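The idea of checking numerically rather than visually can be sketched quickly. The snippet below is a minimal, hypothetical illustration (not anyone's actual tool): it tallies one channel's histogram, counts pixels piled up at the extremes (clipping) and counts unused levels between the darkest and lightest used level, a rough proxy for banding/posterization.

```python
from collections import Counter

def tonal_stats(values, bits=8):
    """Clipping and banding statistics for one integer channel.

    values: iterable of channel values in 0 .. 2**bits - 1.
    Returns (clip_low, clip_high, empty_bins): pixel counts at the
    black and white points, plus the number of unused levels strictly
    between the darkest and lightest used level -- gaps in the
    histogram are a crude proxy for banding/posterization.
    """
    top = (1 << bits) - 1
    hist = Counter(values)
    clip_low = hist.get(0, 0)
    clip_high = hist.get(top, 0)
    used = sorted(hist)
    lo, hi = used[0], used[-1]
    empty_bins = sum(1 for v in range(lo, hi + 1) if v not in hist)
    return clip_low, clip_high, empty_bins

# A smooth 0..255 ramp: one pixel at each end, no gaps.
smooth = list(range(256))
# The same ramp after a crude levels squeeze: values land only on
# even levels, leaving the comb-like gaps that show up as banding.
banded = [(v // 2) * 2 for v in range(256)]
print(tonal_stats(smooth))  # (1, 1, 0)
print(tonal_stats(banded))  # (2, 0, 127)
```

None of this depends on what the display can render, which is the point: the statistics come from the data, not from what the panel happens to show.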
Perhaps the problem is that you're approaching this issue like a good color scientist and I'm approaching it like a photographer/artist here with typical tools in hand.
The problem is, not all banding is created equally or from the same source.
There is of course image banding. But there's also banding that is produced in the video path of some products.
With a device like, say, an NEC SpectraView with a high-bit panel, and especially if the entire video path is high bit (card, OS, software), there should be zero on-screen banding from the display path. If you see banding, it's in the data. But as others suggested, showing how much is better left to measuring all this.
I generally limit myself to ACR/PS and associated plugins in my pursuit of a satisfactory rendering for my intended output display, and I seldom worry too much about technicalities like the inaccuracy of PS's gamut warning for at least some color spaces, for instance.
PS’s gamut warning is both buggy and inaccurate! Same with Lightroom’s implementation. The methods many are using here to show OOG are far more accurate and don’t treat a tiny OOG excursion and a massive one the same.
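To make the tiny-vs-massive distinction concrete, here is a minimal, hypothetical sketch of a magnitude-based OOG check, not the method used in the tools discussed above. It assumes pixel values have already been converted into the destination space's linear encoding (a real tool would do this through ICC transforms and would ideally score the error as ΔE in a perceptual space); in-gamut values then fall in [0, 1], and the score is simply the distance from the pixel to its gamut-clipped version.

```python
def oog_magnitude(rgb):
    """Per-pixel out-of-gamut magnitude in destination-space RGB.

    rgb: (r, g, b) in the destination space's linear encoding, where
    in-gamut values lie in [0, 1]. Returns the Euclidean distance
    from the pixel to its gamut-clipped version: 0.0 for in-gamut
    pixels, and growing the further outside the pixel sits -- so a
    barely-OOG colour is not scored the same as a wildly-OOG one,
    unlike a binary gamut-warning overlay.
    """
    clipped = tuple(min(1.0, max(0.0, c)) for c in rgb)
    return sum((a - b) ** 2 for a, b in zip(rgb, clipped)) ** 0.5

print(oog_magnitude((0.5, 0.5, 0.5)))   # 0.0 -- in gamut
print(oog_magnitude((1.02, 0.5, 0.0)))  # barely out
print(oog_magnitude((1.4, 0.5, -0.3)))  # well out
```

A binary warning would flag the last two pixels identically; the magnitude makes it obvious which one actually matters for the print.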