Cyber1966
Member
Hi, for 10-bit video editing, should I be bothered that a monitor is 8-bit + FRC instead of true 10-bit?
Getting back into 10-bit videography after finally putting together a powerful enough editing machine. I was able to deal with it previously, but only painstakingly.
Been eyeing the BenQ 3200U, which is specified as 8-bit + FRC.
After researching for over a week, it only now hits me to wonder whether I should be concerned about that part of its capability.
But then I'm wondering: does it even matter? I'm not looking to break it down to rocket science or get as concerned with pixel peeping in video as I am in photography, at least not to the same extent. The bulk of the world is viewing on 8-bit displays, aren't they? So shouldn't the mere fact that I'm shooting in 10-bit put any surface concerns to rest? Or is it a case of not knowing what I'm missing, since I can't even see what true 10-bit looks like as it is?
Uh-oh, my head's ready to explode here.
Am I overthinking things now?
Help!