I'm not opposed to shooting RAW occasionally. I'm opposed to the theory that pros are supposed to shoot all-RAW at all times. I use RAW from time to time for available-light shutter speed, where I know the picture as captured would have to be way under-exposed to get the necessary shutter speed, and only for those few specific shots. When blown highlights are the concern, shooting all-RAW is a poor crutch; one should think of:
1. exposing better;
2. bracketing;
3. introducing sufficient light for proper flash/ambient balance.
Unlike at the shadow end, where clumping can become an issue when data is rescaled, the real gain at the highlight end from shooting RAW is minuscule . . . and to get it, one has to shoot all-RAW, because over-exposure is by definition a mistake (unlike intentional under-exposure to gain shutter speed, where one can switch to RAW just for that shot, then switch back).
No one's perfect, but perfection on every shot is not what the real world demands either. Do you shoot a 60-frames-per-second camera because no one's perfect at catching the perfect moment? The gain from RAW is minuscule anyway; RAW does not preserve all 14-17 stops that the human eye can see. What it does is re-slice the stops that the sensor is capable of capturing. For that, you have to make a number of trade-offs, especially relating to data security if you run a busy studio.
As for arrogance and ignorance, please . . . you are projecting . . . perhaps you are too cheap to buy additional cards yourself, but at my end, I've partially switched to a real "efilm" system, where the cards do not get recycled at all. Yes, a memory card gets used only once, then goes straight into the archive . . . for ultimate data security.
As for excluding all pros who shoot only RAW: well, while I'm not shooting exclusively in JPEG myself, some pros with far higher credentials than mine do. Judging artists by the tools they use is, well, how shall I put it, arrogant and ignorant.