Richard Butler

DPReview Administrator
Lives in Seattle, United Kingdom
Joined on Nov 7, 2007

Comments

Total: 3912, showing: 61 – 80
On article Still solid: Fujifilm X-E2S Review (229 comments in total)
In reply to:

Peiasdf: Fuji's lack of innovation is really showing. Same old sensor and SoC with poor AF. Cheating on the ISO number to cover up the age of the sensor. Relying on film simulations and B&W because of X-Trans' RAW processing issues.

What evidence do you have that Fujifilm is doing anything that's inconsistent with the ISO standard?

Link | Posted on Aug 9, 2016 at 19:17 UTC
On article Still solid: Fujifilm X-E2S Review (229 comments in total)
In reply to:

TangoMan: The Fujifilm X-E2S, wait a minute... I was young when that was announced! They still sell those?
I don't understand why DPReview bothers to review cameras so late in their life cycle. Half a year ago it would have been interesting, but a new generation of hardware (and firmware) has been out for a while now...

Six months into its life cycle doesn't seem *that* late.

Link | Posted on Aug 9, 2016 at 19:08 UTC
On article Still solid: Fujifilm X-E2S Review (229 comments in total)
In reply to:

Jon Porter: I love my X-E2S but it's not a camera for a beginner, who would quickly become overwhelmed by the controls; the Fuji X system is aimed exclusively at enthusiasts. A novice would be much better served by one of Canon or Nikon's lower-end SLRs.

That's exactly what we have tried to address in this review. Does it suit the novice attracted by the lower price and does it make sense for the enthusiast-on-a-budget, given how much the market has moved on?

Link | Posted on Aug 9, 2016 at 19:03 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

notpc: Am I the only one who thinks dpreview's previous test scene was FAR more useful in showing the real differences that I'm looking for in a photo when comparing cameras? When DPR changed scenes, I started using Imaging Resource's test samples because they still use a more typical/natural set of objects (the one with the bottles, crayon box, fabrics, threads, etc) where I can see the combo of noise, contrast, sharpness, etc in the way that I want to. It matters that it's more like a real photograph. At dpreview, most of the scene seems artificial, contrived and flat. I REALLY would prefer to go back to the previous scene. Other than that, the interactive aspect of the comparison tools is excellent.

I think 'the additional shots would mean more work' might be one of the great understatements of our time.

The problem was that the old scene was a nightmare to shoot and our improving comparison tool led to arguments over imprecisions, rather than meaningful differences. It's more likely that we'll try to go back and shoot historical benchmark cameras with the new scene than try to fill in the four-year backlog missing from the old scene.

There was a discussion about incorporating the watch and several other aspects of the old scene into the new one, but it was decided that them suddenly becoming smaller ([or, rather, further away](https://www.youtube.com/watch?v=vh5kZ4uIUC0)) might lead to confusion.

Link | Posted on Aug 9, 2016 at 00:29 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

Contra Mundum: What do you do when ACR version changes? Do you ever go back and retest with a newer version?

If there's a significant change in performance, we try to. However, it's a lot of work for each camera and it requires us to notice this change, so it's not guaranteed.

Link | Posted on Aug 8, 2016 at 22:53 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

Contra Mundum: I think you are saying that the raw images have the white balance and brightness adjusted to equalize them. But you are not saying how you do that. Are you basing these adjustment on one selected patch? Do you adjust at every ISO? Do you average the noise? Does Adobe apply the same tone curve to all raw files? Do you modify the tone curve at all?

We use one of the grey patches on the Gretag chart and place several samplers on it (a lighter patch for the low light scene). With chroma and luminance NR on, we adjust them to as close to 119, 119, 119 as possible (neutral middle grey). This is done separately for each ISO setting.

We then minimize the noise reduction, export to Photoshop and apply standard sharpening there (because ACR calibrates sharpening differently for each camera).

Adobe Camera Raw's own profiles tend to honour the camera's tone curve/metering but aren't likely to be entirely consistent camera-to-camera.
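
If it's easier to follow in code, here's a minimal sketch of that per-ISO normalization step (NumPy; the sampler helper, box coordinates and per-channel scaling are illustrative assumptions, only the 119, 119, 119 target and the per-ISO loop come from the description above):

```python
import numpy as np

TARGET = np.array([119.0, 119.0, 119.0])  # neutral middle grey, per the workflow above

def patch_mean(image, box):
    """Average RGB over a sampler box (y0, y1, x0, x1) - stands in for ACR's colour samplers."""
    y0, y1, x0, x1 = box
    return image[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)

def normalize_to_grey(image, grey_box):
    """Scale each channel so the chosen grey patch lands as close to 119, 119, 119 as possible.
    Illustration only: the real adjustments are made with ACR's white balance and exposure
    controls, not by direct per-channel scaling of the demosaiced image."""
    gains = TARGET / patch_mean(image, grey_box)
    return np.clip(image * gains, 0, 255)

# Done separately for every ISO setting, as described above:
# for iso, shot in shots_by_iso.items():
#     shots_by_iso[iso] = normalize_to_grey(shot, grey_box=(100, 140, 200, 240))
```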

Link | Posted on Aug 8, 2016 at 22:42 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

2eyesee: I love the photo of the test scene at the top of the page - I'd be interested in how it was done. Is it simply combining 2 images, with one semi-transparent?

It was a 30 second exposure using a variable ND filter at its darkest setting. Both Sam (in the scene with the light meter) and I (the silhouette operating the camera) intentionally moved during the exposure to ensure we weren't too solidly represented in the image.
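
For anyone curious about the arithmetic, here's a rough sketch of how much ND that implies (the 1/30 sec unfiltered starting point is an assumption; only the 30 second final exposure comes from the description above):

```python
import math

base_exposure = 1 / 30    # assumed unfiltered shutter speed under the studio lights (hypothetical)
final_exposure = 30       # the 30 second exposure described above

# Each stop of neutral density doubles the required exposure time,
# so the filter strength is the base-2 log of the time ratio.
stops_needed = math.log2(final_exposure / base_exposure)
print(f"ND strength needed: about {stops_needed:.1f} stops")   # ~9.8 stops
```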

Link | Posted on Aug 8, 2016 at 22:35 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

notpc: Am I the only one who thinks dpreview's previous test scene was FAR more useful in showing the real differences that I'm looking for in a photo when comparing cameras? When DPR changed scenes, I started using Imaging Resource's test samples because they still use a more typical/natural set of objects (the one with the bottles, crayon box, fabrics, threads, etc) where I can see the combo of noise, contrast, sharpness, etc in the way that I want to. It matters that it's more like a real photograph. At dpreview, most of the scene seems artificial, contrived and flat. I REALLY would prefer to go back to the previous scene. Other than that, the interactive aspect of the comparison tools is excellent.

The problem with the previous scene was that it was too small and too 3D to shoot with the level of consistency we wanted. As compacts started to include wider lenses, the drawbacks of the existing scene were becoming overwhelming (I have a photo of Lars trying to shoot the old scene with a Sigma DP1 and he's pretty much having to climb inside the scene to shoot it).

We did our best to continue to offer all the things the old scene did while also adding new ones (a better low-contrast detail target, different lighting settings). But believe me, it was not an easy decision to make to abandon so many years of existing cameras.

Link | Posted on Aug 8, 2016 at 22:33 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

camcom12: With more high quality fixed-zoom lens cameras appearing over the past 5 years, could the test scene also be used not only at nominal focal length (~50mm equivalent) but also at the camera's widest, half-zoom, and when practical, full zoom? I realize there are technicalities that would need sorting out: e.g. how to properly test max F.L. on superzooms.

That said, DPR's review of the RX10 III has some very useful max-zoom images since studio-restricted images may not reveal all capabilities.

We're looking at better ways of representing lens performance.

We'd need to considerably rework the comparison tool interface to offer different focal lengths while still being able to offer different ISO settings and lighting conditions (should the image jump back to the standard focal length when you change ISO, or should we try to shoot all ISOs, in two lighting conditions, in Raw and JPEG, at a variety of focal lengths?)

For now, this scene won't address lenses.
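
Some back-of-the-envelope arithmetic shows how quickly the shooting matrix grows (the counts here are invented purely to illustrate the scale of the problem, not our actual test plan):

```python
# Hypothetical counts, just to show how the matrix multiplies out.
isos = 10            # e.g. base ISO through the top expanded setting, in whole stops
lighting = 2         # daylight-balanced and low-light versions of the scene
formats = 2          # Raw and JPEG
focal_lengths = 4    # wide, half-zoom, ~50mm equivalent, full tele

current = isos * lighting * formats     # one focal length, as the scene is shot today
proposed = current * focal_lengths      # every combination at every focal length

print(f"shots per camera today: {current}")   # 40
print(f"with focal lengths:     {proposed}")  # 160
```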

Link | Posted on Aug 8, 2016 at 22:29 UTC
On article Sony a7R II versus a7 II: Eight key differences (397 comments in total)
In reply to:

William Krusche: There used to be a time when we had to buy expensive magazines to keep up with the newest reviews. Now these reviews are free on sites like Dpreview. If Dpreview does indeed receive some extra 'advertising money' for posting articles like this one, then so they should. If amateurs like me don't need to go out and pay for overpriced magazines any more, which were, by the way, jam-packed with advertising, then you won't hear me complain.

The post is not, in any way, supported or paid for by any manufacturer. It's merely an article format we're experimenting with.

We often see 'Which camera should I buy' threads on the forum and so wanted to see if we could help shed some light on specific choices. We did Fujifilm X-T2 vs X-Pro2 a couple of weeks back, a7 II vs a7R II this week and we have various other combinations (not always from the same brand) that we're working on.

Sorry if you're not finding them useful, but they are in response to what we thought was a reader demand, not any concerns about marketing or sales on the part of camera makers.

Link | Posted on Aug 8, 2016 at 20:57 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

Ken Croft: Adobe Camera Raw may not get the best out of each individual camera. So is it just for convenience that you settle on the same raw converter for all cameras? It may seem that you are creating a level playing field, but would it not be fairer to each camera to get the very best possible performance figures?

It's not a question of 'trusting' ACR, it's a question of showing what you get on a normalized basis with a tool that's central to a lot of people's workflows.

There are a couple of reasons we don't use the manufacturers' bundled software. In the past, many pieces of bundled software would essentially just mimic or exactly reproduce the camera's JPEGs - meaning that we would test the same thing twice and still not get to see what's going on 'behind the scenes.'

Also, much of the software bundled with cameras is dreadful. If you can only get the best results by using software that's slow, buggy or just really difficult to learn, then should we mark the camera down for forcing that upon you?

The reason we make the Raw files available is so that you can download them and see if you can get better results from your preferred converter. But for the most part, we assume the JPEGs are the manufacturers' idea of what's 'best' and that ACR shows a warts-and-all look at the underlying files.

Link | Posted on Aug 8, 2016 at 19:46 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

TheDarmok74: It certainly looks like dpr go to great lengths when it comes to the test scene.
And yet, how can things like the E-P5 test scene happen?
Just look at it, compare the scene to that taken with the E-M5 e.g.
https://www.dpreview.com/reviews/olympus-pen-e-p5/15

Probably shutter shock and if it is I have two questions:
how did you not reshoot this after the FW-fix? And how could you give a camera an award that couldn't take sharp photos a lot of the time?
For me the E-P5 was such a disappointment and with it I questioned the dpr tests.

As I say, I did try to make the problem clear in the review, which originally ended:

*However, its inability to correct image shake at what should be usable shutter speeds means we don't feel able to unreservedly recommend the E-P5. We're hoping an improvement can be made to the camera's stabilization system but, as it stands, there's too much risk of your best shots being undermined - something that's unacceptable at this level. As such, we can't give the E-P5 as high an award as it would otherwise receive.*

But if the result is that you spent your money on a camera that left you disappointed, then it means I didn't spell it out clearly enough, for which I apologise. And I do think it's fair criticism to question the Silver award.

Link | Posted on Aug 8, 2016 at 19:39 UTC
On article Nikon D7200 Review (642 comments in total)
In reply to:

tarunkr: While you've guessed the sensor in D7200 is an updated version of Toshiba one, here in the following review it's showing that it's a Sony sensor like that of D5500!
http://www.lightandmatter.org/2015/equipment-reviews/nikon-d5500-vs-d7200-which-should-you-buy/
It is a bit confusing.

We're still not in a position to be certain either way. Some pieces of information (Nikon's product photos, some of DxO's measurements) suggest it's a Toshiba; other people are convinced it's a Sony. Either could be correct.

The question is: does it matter?

Our review tries to show you what the performance looks like, in comparison to other cameras and sensors. It doesn't have the banding that affected the D7100 and it shows excellent DR and high ISO performance. We think that's more important than Nikon's supply chain choices.

Link | Posted on Aug 8, 2016 at 18:04 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

TheDarmok74: It certainly looks like dpr go to great lengths when it comes to the test scene.
And yet, how can things like the E-P5 test scene happen?
Just look at it, compare the scene to that taken with the E-M5 e.g.
https://www.dpreview.com/reviews/olympus-pen-e-p5/15

Probably shutter shock and if it is I have two questions:
how did you not reshoot this after the FW-fix? And how could you give a camera an award that couldn't take sharp photos a lot of the time?
For me the E-P5 was such a disappointment and with it I questioned the dpr tests.

I'm afraid I can't remember whether we'd been briefed that the '0 second' anti-shock setting was going to be added to the E-P5 when we reviewed it, so I can't be sure what the logic was behind giving it a Silver award.

I tried to spell out that shutter shock was a problem for the camera and that it was serious enough to change our overall rating but I'm sorry that I didn't spell it out strongly enough to prevent you being disappointed.

During busy periods with new cameras arriving, it can be difficult to find the time to go back and re-test, check, process and upload an existing camera (especially when a key part of the review hinges on something shown with the existing images). We'll look at how we handled that particular situation to assess whether we should have dealt with it differently.

Link | Posted on Aug 8, 2016 at 17:55 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

Nephi: Do you guys plan on redoing the Canon 5DSR test scene? The images were taken with an older ACR profile that resulted in higher than normal contrast.

We'll certainly look at the current profile and see if there's been a significant change.

Link | Posted on Aug 8, 2016 at 17:38 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

The Squire: @Richard, are you considering a moving studio scene for a similar rigorous test of video quality?

I see you shoot video on the static scene already, which is great for pixel peeping the theoretical best image quality in video, but I figure some animation in the scene would allow us to evaluate how the codec holds up to compression artifacts in complex scenes, lots of movement and low contrast parts - All of which can suffer a lot more when the codec is dealing with variation between frames.

Absolutely. The current test scene is only really useful for understanding the way the sensor is being sampled. We certainly want to be able to test and show more than that, in a controlled, repeatable manner.

Link | Posted on Aug 8, 2016 at 17:16 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

surlezi: @DPR staff
How is lens vignetting handled?
What is done to correct across-the-frame sharpness issues of the lenses?

The 50mm and 85mm primes we tend to use exhibit very high cross-field uniformity and very little in the way of vignetting. All of our resolution targets are clustered towards the middle of the frame so that compacts aren't disadvantaged.

Link | Posted on Aug 8, 2016 at 17:14 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

fPrime: I'm not opposed to the "new" studio test scene, but year(s) after launch it still has no older cameras in its database... only cameras released in the last four years. Although it may be a surprise to the DPR editors, there are still a lot of people who shoot with a D700 or D3s. It'd be great to be able to compare these classics against the D5 or D810 on the comparator, wouldn't it? Similar examples could be found for Canon, Leica, Sony, Pentax, etc. Adding a handful of classic cameras to the comparator would reconnect it with the large user base still enjoying older tech.

It's something we originally hoped to do (and we had a list of benchmark cameras we hoped to test). However, getting hold of cameras known to be in good condition, years after their release, was harder than we expected.

Link | Posted on Aug 8, 2016 at 17:05 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

maxnimo: To me the best image test is a group shot with real human faces with hair, and distant trees, grass and flowers.

Sadly that's difficult to achieve all year 'round. Cameras reviewed during the summer would look much better than those shot during the winter (a criticism we've faced about our sample galleries).

Link | Posted on Aug 8, 2016 at 17:04 UTC
On article An introduction to our studio test scene (105 comments in total)
In reply to:

curiosifly: "Raw images are shot using set combinations of shutter speeds and apertures to allow the assessment of sensor performance on a common basis (so at any given ISO, all cameras will receive the same amount of light)."
Does this mean that if the actual sensitivity is different between manufacturers at the same ISO, the brightness of the scene should look different, because they all have the same exposure? Fuji is said to overrate their ISO. But its image brightness in Raw looks the same as other manufacturers' at the same ISO. Why is that?

All cameras will show the same brightness because we adjust the white balance and brightness back to a common point (it's very hard to visually compare images of different brightnesses).

However, since they're all given the same amount of light, this would reveal any differences in performance between sensors, regardless of how the manufacturer calibrates or rates its sensitivity.
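
A toy example of the same point, with invented numbers (the gain figures are hypothetical, not measurements of any real camera):

```python
# Two cameras shot at the same nominal ISO with the same shutter speed and aperture
# receive the same amount of light, so any difference in the recorded raw level
# reflects the manufacturer's sensitivity calibration, not the exposure.
light_collected = 1.0        # same scene, same shutter speed and aperture for both

camera_a_gain = 1.00         # hypothetical: rates its ISO 'normally'
camera_b_gain = 0.70         # hypothetical: rates its ISO conservatively (darker raw file)

raw_a = light_collected * camera_a_gain
raw_b = light_collected * camera_b_gain

# We then brighten each file back to a common target, so both *look* the same...
display_a = raw_a / camera_a_gain
display_b = raw_b / camera_b_gain
print(display_a == display_b)   # True: equal brightness in the comparison tool

# ...but camera B needed more brightening, which also amplifies its noise more.
# That difference in performance is exactly what the equal-light test makes visible.
```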

Link | Posted on Aug 8, 2016 at 17:03 UTC