I always find posts about color management a bit 'scary', in the sense that I am afraid of doing things wrong. So I do nothing (I am not a professional user).
I have a factory-calibrated BenQ PD3200U 4K 32" monitor, and Windows recognizes both the monitor and my Nvidia GeForce RTX 2060. The calibration devices used in the factory are mostly a factor of 10 more expensive than typical home calibration devices. The only disadvantage is that degradation of the monitor over time is not corrected. But when it looks good to my eyes, it is good for me.
So I leave it as it is. For processing and editing I use DxO PhotoLab (PL8) with its Wide Gamut working color space. PL can warn when colors fall outside the monitor's color space, and that is only rarely the case.
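(For anyone curious what such an out-of-gamut warning amounts to under the hood, here is a rough sketch in Python using Pillow's littleCMS bindings. It is a generic round-trip technique, not DxO's actual implementation, and the profile filenames are placeholders.)

```python
# A rough sketch (not DxO's actual code) of one way an out-of-gamut
# warning can work, using Pillow's littleCMS bindings. The .icc
# filenames are hypothetical placeholders.
from PIL import Image, ImageChops, ImageCms

WORKING_PROFILE = "WideGamutWorking.icc"  # placeholder working-space profile
MONITOR_PROFILE = "BenQ_PD3200U.icc"      # placeholder monitor profile

img = Image.open("photo.tif").convert("RGB")

# Round-trip the image into the monitor space and back. Relative
# colorimetric clips colors the monitor cannot show, so clipped pixels
# no longer match the original after the round trip.
to_monitor = ImageCms.profileToProfile(
    img, WORKING_PROFILE, MONITOR_PROFILE,
    renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC,
)
round_trip = ImageCms.profileToProfile(
    to_monitor, MONITOR_PROFILE, WORKING_PROFILE,
    renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC,
)

# Flag pixels whose round-trip error exceeds a small threshold
# (the threshold absorbs 8-bit rounding noise for in-gamut pixels).
diff = ImageChops.difference(img, round_trip).convert("L")
warning_mask = diff.point(lambda v: 255 if v > 8 else 0)
warning_mask.save("gamut_warning_mask.png")
```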
I too have a BenQ monitor, older than yours, and I regularly generate a new ICC profile using an i1Display Pro and the Calibrite PROFILER application (which replaces the old X-Rite i1Profiler application). I might agree with your comment "... when it looks good to my eyes, it is good for me ..."; however, my interest in color management is to get a higher degree of color fidelity from my screen through Photoshop to my Canon printer, so that when I spend time and effort preparing an image for printing I am (hopefully) making edits that produce a better (if not the best) print. That is, while "looks good enough" may look OK to my eyes, it may not yield a very good print. At least that's my understanding.
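To make the screen-to-print idea concrete, here is a minimal soft-proofing sketch using Pillow's littleCMS bindings, assuming the image is sRGB and that you have monitor and printer ICC profiles on disk (the filenames below are placeholders, not real profile names). It approximates what Photoshop's Proof Colors view does: render on the monitor a preview of how the printer should reproduce the image.

```python
# A minimal soft-proofing sketch using Pillow's littleCMS bindings.
# The .icc filenames are placeholders; the image is assumed to be sRGB.
from PIL import Image, ImageCms

MONITOR_PROFILE = "BenQ_SW2700PT.icc"     # placeholder monitor profile
PRINTER_PROFILE = "Canon_Pro_Lustre.icc"  # placeholder printer/paper profile
SRGB = ImageCms.createProfile("sRGB")

img = Image.open("print_candidate.tif").convert("RGB")

# Build a proof transform: image space -> monitor space, while
# simulating how the printer/paper combination would render it.
proof = ImageCms.buildProofTransform(
    SRGB, MONITOR_PROFILE, PRINTER_PROFILE,
    "RGB", "RGB",
    renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC,       # image -> printer
    proofRenderingIntent=ImageCms.INTENT_ABSOLUTE_COLORIMETRIC,  # printer -> monitor
)
preview = ImageCms.applyTransform(img, proof)
preview.show()  # roughly what the print should look like, shown on this monitor
```

The absolute-colorimetric intent on the printer-to-monitor leg is what preserves the simulated paper white; switching that leg to relative colorimetric gives a punchier but less faithful preview.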
I am following this thread's discussion with interest, as I may eventually replace my BenQ monitor - which does not support hardware calibration - and I'd like to better understand how hardware calibration (profiling?) works compared to the X-Rite/Calibrite approach.
Peter
PS, my BenQ is the SW2700PT