Let's face it: if there were a bias against Oly, then how come their
lenses keep getting such good reviews? The reason is that lenses
don't involve many ergonomic considerations, and characterising their
performance is a lot less complex and subjective than reviewing a
camera body.
I'd argue that the reason is that lenses are tested according to strict technical criteria, and then scored against those criteria. I could pick up a lens and dislike its finish, dislike its action when zooming/focusing, dislike the colour it's painted, dislike its hood, and dislike any tripod mount it may have.
But if all of those things function well enough, and it gets good test results, then I have to write it up as a good lens - otherwise the data and my review conflict, and the review is a bad one.
I may throw in a note at the end saying I dislike these various things, but I will
have to make it clear that these things are my personal choice, and that it got good test results.
Hence, the apparent "bias" against Oly isn't really bias at all -
it's just the inevitable subjectivity that will creep into any review
of a camera that is done by a human being.
Subjectivity creeps in, yes.
But camera bodies do not appear to be reviewed in the same way as lenses; there is a lot more subjectivity involved. Some tests involve timings or image analysis, but even those aren't always precise enough to keep the reviewer's perception out of the result.
That allows personal perceptions and preferences to be reinforced.
This can be removed, and one good way to do so is to make the reviewing process revolve much more around data.
For example, I recently entered a photographic scavenger hunt. Each person submits 10 photos, one for each of the topics. With about 120 photos to review before I could cast my vote for the winner of the best set, I had to do something to make it easier. So I decided to simply score each photo from 1 to 3: 1 being a poor shot for the topic, or a poor photo; 3 being excellent.
The aggregate scores for some sets changed my view of whether they were good or bad - a steady flow of 2s with an occasional 3 meant that a set could actually edge ahead of one with one or two 3s but then a lot of 2s and 1s. The scoring system allowed me to average out the good and bad aesthetic reactions I had, and determine who had done well overall - not just gotten a reaction on one or two photos.
Without that, I'd probably have just judged each person's 10 photos on the best or worst reaction they gave me.
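To make the idea concrete, here's a minimal sketch of that 1-3 scoring approach. The entrant names and individual scores are invented for illustration - the point is that ranking by total score rewards consistency over one or two standout shots:

```python
# Hypothetical 1-3 scores for each entrant's 10 photos.
photo_scores = {
    "entrant_a": [3, 1, 2, 1, 3, 1, 2, 1, 2, 1],  # two standout 3s, but uneven
    "entrant_b": [2, 2, 2, 3, 2, 2, 2, 2, 3, 2],  # consistently solid
}

# Rank entrants by total score rather than by the single best photo.
ranked = sorted(photo_scores, key=lambda e: sum(photo_scores[e]), reverse=True)

for entrant in ranked:
    scores = photo_scores[entrant]
    print(entrant, "total:", sum(scores), "average:", sum(scores) / len(scores))
```

With these made-up numbers, the consistent set (22 points) edges ahead of the set with the memorable shots (17 points), even though the latter is what a "best reaction" judgement would have picked.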
But I've become more and more convinced that most product reviewers have no such system, unless they find themselves doing a "group test". This is probably because it would be very difficult to produce an in-depth system for reviews, given how complex cameras are - and how rapidly they innovate.
Even so, I think a good system would probably be a simple "1 is basic, 3 is excellent" score across a range of simply defined items, with things like static and tracking focus treated as separate items. I say "good" because you'd probably find that most cameras at a particular level would always score very similarly - an E520 is going to get a similar score to any of its competitors in most areas with this system, and the few areas where it gets a 1 (focus) may well be offset by a 2 or a 3 in another area (size? configuration menu?).
The overall score may help the reviewer be more positive about the camera, or at least point out where its failings are. And the lack of a wide score range means that this isn't reduced to statistics masturbation - something is either poor, passable or excellent.
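A per-category score card like that is trivial to tabulate. The sketch below uses the E520 example from above; the category list, the competitor, and every individual score are made up to illustrate the mechanism, not real test results:

```python
# Invented 1-3 scores per category; "weak areas" are anything scoring 1.
categories = ["static focus", "tracking focus", "size", "configuration menu"]

cameras = {
    "E520":       {"static focus": 2, "tracking focus": 1, "size": 3, "configuration menu": 2},
    "competitor": {"static focus": 2, "tracking focus": 2, "size": 2, "configuration menu": 2},
}

for name, scores in cameras.items():
    total = sum(scores[c] for c in categories)
    weak = [c for c in categories if scores[c] == 1]
    print(name, "total:", total, "weak areas:", weak or "none")
```

With these invented numbers both cameras total 8, which is exactly the point: the overall scores come out similar, while the score card still flags precisely where each camera falls down.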
Note that every camera does already get a score at the end of the review, and some scores for specific areas too. But no criteria or testing procedure is published for how those figures are arrived at. They could be rolling dice for all we know...
Of course, the other issue is that this system of scoring needs to be applied rigorously. Each camera needs to be evaluated in the same way, against common criteria. And whilst the image tests make it appear that this is so, I'm not convinced that the reviewers are handling and testing the functionality of the cameras with the same impartial rigour that they apply in the imaging tests.
--
Check my poster's profile for gear details:
http://forums.dpreview.com/forums/postersprofile.asp?poster=hgiuidiuhdie