I posted my views on this in another thread on the same subject:
http://forums.dpreview.com/forums/read.asp?forum=1019&message=24438663
I have never seen any convincing evidence that humans, especially
adults, can tell the difference between 16-bit/44.1kHz and 24-bit/96kHz
or 192kHz audio. All I've read is ravings about the "high resolution"
thing that border on the supernatural, about "ultra frequencies" and
such; some even cite non-human animal hearing to support their case.
I'd like to see a properly done double-blind test on this.
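For what it's worth, the basic numbers support the skepticism: sampling
at 44.1kHz captures everything up to the Nyquist limit of 44.1/2 =
22.05kHz, which is already above the roughly 20kHz ceiling of healthy
adult hearing, and 16 bits gives about 16 × 6.02 ≈ 96dB of dynamic
range, roughly the span from a quiet room to painfully loud playback.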
I think studios use high-bit-depth, high-sample-rate audio because they
edit, for the same reason we do 16-bit editing in Photoshop: every
processing step rounds the data, and the extra working precision keeps
those rounding errors from stacking up into something audible.
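Here's a rough sketch of that effect in Python with NumPy (the signal
and the gain steps are made up for illustration): apply the same chain
of edits twice, once rounding back to 16-bit after every step, like
editing a finished 16-bit file, and once in full precision, like a
studio working format, then compare the accumulated error.

```python
import numpy as np

# Illustration only: signal and edit steps are arbitrary placeholders.
rng = np.random.default_rng(0)

def quantize16(x):
    # Round to the nearest step a 16-bit file can store
    return np.round(x * 32767) / 32767

source = quantize16(rng.uniform(-0.5, 0.5, 44100))  # a 16-bit "source file"
gains = [0.7, 1.3, 0.9, 1.1, 0.85, 1.2]             # arbitrary edit steps

per_step = source.copy()
full_prec = source.copy()
for g in gains:
    per_step = quantize16(per_step * g)  # re-quantized after every edit
    full_prec = full_prec * g            # full precision throughout

# Normalize out the net gain so the two results are directly comparable
net = np.prod(gains)
error = np.abs(per_step / net - full_prec / net)
print(f"peak accumulated error: {error.max():.2e}")
print(f"one 16-bit step:        {1 / 32767:.2e}")
```

The peak accumulated error should come out several times larger than a
single 16-bit step, which is exactly the headroom argument for editing
at 24-bit or float even when the final delivery is 16-bit.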
But even if some people could tell the difference, they would be a
minority, and hearing it would also require thousands of dollars' worth
of excellent audio equipment, so it's a minority of a minority. To me,
consumer "high resolution" audio was dead from the start, regardless of
the "format war". What are your thoughts?