Using PDAF pixels for depth mapping, bokeh simulation

Started Jun 14, 2018 | Discussions thread
(unknown member) Contributing Member • Posts: 975
Re: Google's results can't compete with a dual camera

I don't have a Pixel 2, but I have seen many reviews showing that the Pixel 2 generates worse depth maps than the iPhone for objects more than a meter away. In my opinion it looks awful, and I would prefer the iPhone in this regard. The other link I posted confirms that multiple users see the same issue. Even DxOMark's sample images show it:

Pixel 2:

iPhone 8 Plus:

DPReview's opinion:

"The new Pixel 2 fares the worst in this comparison, with multiple artifacts throughout the image."

Or look at 6:23. It looks like exactly the same issue.

Google actually shows such issues in their paper: they blame lens aberrations and sensor defects, and they also note that this dual-pixel method only works for "nearby scenes" or "macro-style photos". The authors also say that "The small stereo baseline of the dual-pixels causes these [disparity] estimates to be strongly affected by optical aberrations."
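The baseline problem is easy to see with the standard stereo depth formula Z = f·B/d: for a fixed disparity error, the depth error grows with Z² and shrinks with the baseline B. Here is a rough sketch of that first-order relationship; the focal length, baselines, and disparity noise below are all illustrative assumptions on my part, not measured values for any phone.

```python
def depth_error(z_m, baseline_m, focal_px=3000.0, disparity_noise_px=0.1):
    """First-order depth error from Z = f*B/d: dZ ~ Z^2 * eps / (f * B).

    z_m: scene depth in meters; baseline_m: stereo baseline in meters;
    focal_px: focal length in pixels; disparity_noise_px: disparity noise
    in pixels. All defaults are assumed, illustrative numbers.
    """
    return (z_m ** 2) * disparity_noise_px / (focal_px * baseline_m)

for z in (0.5, 1.0, 3.0):
    dp = depth_error(z, baseline_m=0.001)  # dual-pixel: ~1 mm baseline (assumed)
    dc = depth_error(z, baseline_m=0.010)  # dual camera: ~10 mm baseline (assumed)
    print(f"Z={z:.1f} m: dual-pixel ~{dp*100:.1f} cm, dual camera ~{dc*100:.1f} cm")
```

With these assumed numbers, a 10x larger baseline gives a 10x smaller depth error at every distance, and the dual-pixel error at 3 m is already tens of centimeters, which is consistent with the paper's "nearby scenes" caveat.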

By the way, I also don't like Google's decision not to simulate depth of field realistically. In the paper they say they deviate from realistic depth of field in order to keep more of a person or dog sharp in the photo. I don't like this, and it's an inconsistent philosophy, because Google also adds noise to the blurred background to make the photo look more realistic. Apple isn't much different: it only applies blur to the background, and it even blurs the chest when it is in the focal plane.
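For anyone curious what "blur the background, then add noise back" looks like in practice, here is a toy sketch of that kind of pipeline. This is my own minimal illustration, not Google's or Apple's actual algorithm: the box blur, depth threshold, and noise level are all made-up parameters, and a real implementation would use a depth-varying disc kernel instead.

```python
import numpy as np

def box_blur(img, k=9):
    """Naive k x k box blur of a 2D grayscale image, edge-padded."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fake_bokeh(img, depth, focal_depth=1.0, thresh=0.3, noise_sigma=2.0, rng=None):
    """Blur pixels far from the focal plane, then re-inject noise so the
    smooth blurred background matches the grain of the sharp foreground."""
    rng = np.random.default_rng(0) if rng is None else rng
    blurred = box_blur(img)
    mask = np.abs(depth - focal_depth) > thresh        # background = far from focus
    out = np.where(mask, blurred, img.astype(float))   # blur only the background
    out[mask] += rng.normal(0.0, noise_sigma, size=int(mask.sum()))
    return np.clip(out, 0, 255)
```

The noise step is the part I find inconsistent: it exists purely to make the synthetic blur look photographically plausible, while the depth-of-field itself is deliberately not plausible.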
