
Photos you can refocus: Lytro promises camera within a year

By dpreview staff on Jun 22, 2011 at 19:12 GMT

Startup company Lytro is claiming to be close to launching a camera that allows any point of focus to be specified after the shot is taken. The concept behind the device, called a light-field or plenoptic camera, has surfaced regularly over the past few years, but now Lytro, founded by Stanford PhD Ren Ng, says it will have a product ready within a year. The design uses an array of microlenses to split the incoming light rays across multiple sensor pixels according to the angle from which they arrive. This additional information about the angle of the arriving light makes it possible to recalculate different focus points after the image has been shot, but at the cost of lower image resolution. The company hasn't, as yet, provided details such as its system's output resolution. (From the New York Times)
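
For readers wondering how refocusing after the fact can work at all, below is a minimal numpy sketch of the shift-and-add idea used in light-field rendering: treat the capture as a grid of sub-aperture views and sum them with a per-view shift to synthesize a new focal plane. The array layout, function names and integer-pixel shifts are illustrative assumptions, not details of Lytro's actual pipeline.

    import numpy as np

    def refocus(subapertures, shift_per_view):
        # Shift-and-add refocusing over a (U, V, H, W) stack of sub-aperture views.
        # shift_per_view is the pixel shift per unit offset from the central view;
        # sweeping it moves the synthetic plane of focus.
        U, V, H, W = subapertures.shape
        cu, cv = (U - 1) / 2.0, (V - 1) / 2.0
        out = np.zeros((H, W))
        for u in range(U):
            for v in range(V):
                dy = int(round((u - cu) * shift_per_view))
                dx = int(round((v - cv) * shift_per_view))
                # np.roll wraps at the image borders; a real pipeline would pad or crop instead
                out += np.roll(subapertures[u, v], shift=(dy, dx), axis=(0, 1))
        return out / (U * V)

Sweeping shift_per_view through a range of values produces the family of differently focused images that the interactive demo lets you click between.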

Comments

Total comments: 80
kombizz0
By kombizz0 (Jul 27, 2011)

What a beautiful technology. I am sure that in the very near future Canon can use it in its photography equipment.

0 upvotes
Thanatham Piriyakarnjanakul
By Thanatham Piriyakarnjanakul (Oct 19, 2011)

Canon, Canon, Canon, Cannot...

0 upvotes
mwitalemi
By mwitalemi (Jul 27, 2011)

neat! too bad about the small resolution...

0 upvotes
luisflorit
By luisflorit (Jul 25, 2011)

I am unable to focus on the black square located at (75%,75%) of the frame from the top left. It either focuses on the square at (50%,80%) or on the square at (80%,70%).
I think this picture has just 3 or 4 planes of focus. It's like 3 or 4 pictures stacked, so you cannot actually choose precisely where to focus...??

0 upvotes
Dabbler
By Dabbler (Jul 1, 2011)

I'm guessing we haven't even dreamed of the applications for this technology.

0 upvotes
steveclix
By steveclix (Jun 30, 2011)

This has been done for years, but the marketing wasn't geared up and that company was aiming at quality not quantity.

They have 4 or 5 cameras on the market and they even retro-fit all DSLRs with the capture system - so the only thing new is Low-Res point and shoot!

www.raytrix.de

- Steveclix

0 upvotes
Ivanbcarmo
By Ivanbcarmo (Jun 29, 2011)

So revolutionary that it will even change a lot of photography concepts.

0 upvotes
WinniWood Studios
By WinniWood Studios (Jun 28, 2011)

Watched the interview with the CEO and browsed their website - if this holds true, this is a true paradigm shift for photography.

0 upvotes
Holger Bargen
By Holger Bargen (Jun 27, 2011)

My Pentax K5 has focus bracketing - it should be easy to develop software that constructs something similar in result to the Lytro technology - something like HDR for focus.

Best regards
Holger

0 upvotes
Alec
By Alec (Jun 26, 2011)

This reminds me of the "we, real photographers, ..." LiveView discussions, and the decades-earlier ones in other media about autofocus. :-)

This allows you to achieve
* Single-shot stacked focus (everything in focus)
* Single-shot, single-lens 3D (actual 3D depth map - not stereoscopy)
* Adjusting the focus after the fact.

Page 25 of the company's CEO's Ph.D thesis makes it clear what they are doing: http://www.lytro.com/renng-thesis.pdf

Necessarily, this method of capture trades off resolution for the ability to "play" with the third dimension, and for how finely you can do so. I.e., if you take a 20MP conventional sensor and use it for this application, and the fidelity of depth control is to be similar to what is indicated in the demo pic above, the image resolution would be somewhere around 1 megapixel.

This sounds like a non-starter; however, if one were to take compact-camera or cell-phone pixel density and put it on a full-frame sensor, one could counter the resolution problem.
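
A back-of-the-envelope version of the arithmetic above, assuming each microlens covers an m x m block of sensor pixels (the values of m below are illustrative guesses, not Lytro figures):

    def output_megapixels(sensor_mp, m):
        # Output resolution falls by m**2, the number of sensor pixels behind each microlens.
        return sensor_mp / (m ** 2)

    print(output_megapixels(20, 4))  # 1.25 -> roughly the ~1 megapixel figure quoted above
    print(output_megapixels(20, 5))  # 0.8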

1 upvote
Photato
By Photato (Jun 26, 2011)

Probably good as a niche product and for special applications.
If cheap enough it could also find a market as a carefree novice camera.

0 upvotes
robax
By robax (Jun 26, 2011)

Investment scam, pure and simple. The guy is laughing all the way to the bank. Whether or not the camera/software combo has any real value is irrelevant to the business.

0 upvotes
Alexei G
By Alexei G (Jun 29, 2011)

Not necessarily. Some of the claims are hard to believe indeed, like capturing ALL the light ways in the vicinity, even the ones that never reach the camera's lens, or the infinitely adjustable focus. In reality, the amount of information recorded will likely be about the same as on conventional still cameras. The concept is interesting though.

0 upvotes
Alexei G
By Alexei G (Jun 29, 2011)

I meant "rays" not "ways" of course

0 upvotes
Meta Magico
By Meta Magico (Jul 10, 2011)

I think one of the points Lytro was making is that if 10 megapixels can be squeezed into a compact camera sensor, then 250 megapixels can be squeezed onto the dimensions of a full frame sensor. No manufacturer does it because it makes little sense. Here's a case where it might.

0 upvotes
panzini
By panzini (Jun 25, 2011)

Reading the comments below, it's obvious that most of you are completely misled.

Plenoptic imaging brings forth more than just "depth-of-field" control. It brings about the ability to change the perspective of the image after the fact, and opens the door to auto-stereoscopy. Changing the focus after the fact is a great feature, the purpose of which isn't to gain more DOF as most of you seem to be obsessed about: There is such a thing as creative use of limited DOF, and the ability to shift the focus in an image is also quite literally the ability to shift the point of attention — changing the character and subject of an image radically.

1 upvote
Emmanuel Luz
By Emmanuel Luz (Jun 25, 2011)

Where's the effect of Lytro here? This is just a DOF and Macro to any good lens?

0 upvotes
RobertBarnett
By RobertBarnett (Jun 25, 2011)

They need to market this to other companies, not come out with a one-trick-pony, so-so, no-name camera.

0 upvotes
MtOlympus
By MtOlympus (Jun 24, 2011)

The best market for this may be traffic cameras shooting moving cars.

1 upvote
niclas åberg
By niclas åberg (Jun 24, 2011)

The ability to "refocus" a photo on the fly is a nicety. The real value lies in an image completely in focus. Perfect for casual snaps of the family. Where can I find out more on the hardware? The site is very vague.

0 upvotes
Joseph S Wisniewski
By Joseph S Wisniewski (Jun 24, 2011)

Ren Ng's camera has slightly less ability to achieve "everything in focus" DOF than you get from just stopping down a lens. His own analysis is about a stop less.

1 upvote
datiswous
By datiswous (Jun 25, 2011)

I think it would be more interesting for macro.

0 upvotes
Nazgman
By Nazgman (Jun 24, 2011)

Things will get even more interesting when industry decides on a standard and couples light field photography to microlensed displays. Then there will be no need to process the images, the viewer's eyes and brain will do all the processing: refocusing, perspective correction, depth perception, etc.

The first steps have been taken for 3D TVs without the need for special glasses, the first steps have been taken for light field photography, now combine the two!

1 upvote
andy le anh
By andy le anh (Jun 24, 2011)

Wow!!! Very interesting!
So we won't have any more front-focus or back-focus problems to worry about...

2 upvotes
KenOC
By KenOC (Jun 24, 2011)

So...can software process the image so that all planes are in focus? Infinite depth of field?

0 upvotes
Joseph S Wisniewski
By Joseph S Wisniewski (Jun 24, 2011)

No. People keep asking this. There aren't really "planes"; you gain the ability to select DOF in a range defined by the decimated or subsampled lens aperture. For example, Ren Ng's prototype decimated an f4 lens by a factor of 13.5, which made the DOF "range" that of an f54 lens. Except that by the time the math is done, you can't actually get the f54 sample; at best, you get about f28.

So, not only is the depth of field not "infinite", it's actually less than you'd get by just stopping down a lens on a conventional camera.
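
As a minimal sketch of the arithmetic in this comment: decimating the aperture linearly multiplies the effective DOF f-number by the decimation factor. The ~f28 practical figure is the commenter's estimate of what survives reconstruction, not something computed here.

    def nominal_dof_fnumber(lens_fnumber, decimation):
        # Nominal DOF "range" f-number for a lens decimated by the given linear factor.
        return lens_fnumber * decimation

    print(nominal_dof_fnumber(4.0, 13.5))  # 54.0, i.e. roughly the f54 quoted above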

0 upvotes
Dan Tong
By Dan Tong (Jun 26, 2011)

Of course, at the very least, what you can do is combine all the planes that you bring into focus, using the capability already built into Photoshop CS5.

0 upvotes
sportyaccordy
By sportyaccordy (Jun 23, 2011)

It's like an automatic transmission. For someone who enjoys the art of photography, it's blasphemy. For someone who wants the best pictures of a moment, it's a no brainer.

2 upvotes
paulkienitz
By paulkienitz (Jun 23, 2011)

I notice that in the sample given, the place where I click to refocus really doesn't correlate at all well with where the focus ends up getting put.

1 upvote
RedIsaac
By RedIsaac (Jun 23, 2011)

I'm excited about this. It may not work perfectly, and it may not be for everyone, but developments of this sort are certainly good for photography. There are so many talented photographers on dpreview.com and elsewhere, I can't wait to see what everyone can achieve with this camera.

1 upvote
fillkay
By fillkay (Jun 23, 2011)

The sample seems only to have two planes of focus. There's also a rather obvious grid pattern in the distant plain areas.
Great if it could work and give well resolved images. It would be the first time I'd not regretted choosing a different focus point. I'll be interested to see if Dr Ng can make a success of it.

0 upvotes
fredrbis
By fredrbis (Jun 23, 2011)

I can't get focus except at the front and back extremes. I.e., nothing in the middle range. Is this an intrinsic shortcoming or is the image just a simulation?

0 upvotes
Dan Tong
By Dan Tong (Jun 26, 2011)

I'm pretty sure it is a limitation only of the demo image. On some examples I have been able to get at least 3 different areas that you can bring into focus.

0 upvotes
Elvis Badin
By Elvis Badin (Jun 23, 2011)

it's the devil I tell you :))

0 upvotes
aardvark7
By aardvark7 (Jun 23, 2011)

Ultimately, most digital camera users are 'lazy' and want instant gratification.
This will likely mean that results will only ever be used to show what has just been taken, on the camera LCD, or posted on-line for others to play with.
Anything else and it turns every user into a RAW photographer, having to process all their pictures, and there is great resistance to that among the vast majority of users.
Indeed, to add choosing point of focus and perhaps depth of field to any adjustments of exposure, white balance and cropping makes the task of post-processing far too long-winded or intimidating for all but the keenest and I think the novelty might soon fade.
I'm certain it will have its place, as with 3D cameras, but it won't be mainstream anytime soon.

1 upvote
meanwhile
By meanwhile (Jun 24, 2011)

What makes you think you'll have to wait for post-processing to see the results? I think that eventually (that is, not any time soon) there will be a consumer version of this with a touch screen, you take a photo, and then choose on the touch screen right then what you want in focus. Save, done. My idea, PM me in 2020 please, for details of where to send the royalties to.

1 upvote
flipmac
By flipmac (Jun 24, 2011)

You can do that now, like on an iPhone and Panasonic CS: tap, focus, done. No royalties for u.
Back to the subject, I'm sure they'll make a lytro viewer app for touch devices, such as iPads, etc.

0 upvotes
meanwhile
By meanwhile (Jun 24, 2011)

You can change the focus point /after/ you've taken the photo?

0 upvotes
bernieraffe
By bernieraffe (Jun 23, 2011)

This site shows how the images can be re-focused; I must say I think it's impressive!

http://www.smartplanet.com/blog/science-scope/groundbreaking-camera-lets-you-shoot-now-focus-later/8810?tag=nl.e660

0 upvotes
RichardBalonglong
By RichardBalonglong (Jun 23, 2011)

Great idea and tech! But in my opinion, it's not practical and it's useless for prints. And this technology makes a photographer lazy and less skilled at photographing a subject.

0 upvotes
korayus
By korayus (Jun 23, 2011)

What are the differences from Adobe Magic Lens?
http://www.youtube.com/watch?v=zFTZGaw7rWY

0 upvotes
Joseph S Wisniewski
By Joseph S Wisniewski (Jun 23, 2011)

Adobe cheats a bit.

Ren Ng specified a microlens speed equal to the lens speed, so the plenoptic projections on the sensor are minimally overlapping circles. Adobe uses longer microlenses, which allows overlap. This limits the refocusing effect.

So, a Ren Ng setup, with an f4 main lens and f4 microlenses at 13.5x the pitch lets you refocus your f4 slice anywhere within the DOF of an f22 lens.

The Adobe setup, with the same number of microlenses, lets you refocus anywhere within a narrower DOF range, say an f4 slice of an f11 DOF. It lets you "tweak" focus, but doesn't allow for the insane examples that Dr. Ng shows off. On the other hand, that approach has less effect on resolution. But it's even more computationally intensive, and I'm not sure what the payback for the Adobe approach is. Aside from not having to breach the sensor to build one. ;)

3 upvotes
Franka T.L.
By Franka T.L. (Jun 23, 2011)

Well, an interesting development on the imaging side, but I am not seeing Nikon, Canon or anyone being threatened. It will still require a decent lens and a decent camera, and mostly it requires a highly resolved sensor-level capture to give us the equivalent of today's output.

The technology is useful, especially for many industrial / surveillance / scientific needs. But it's not going to replace good old photographic capture the old way.

1 upvote
miketanct
By miketanct (Jun 23, 2011)

Macro shooters will rejoice. Shooting stacks will be a thing of the past!

0 upvotes
Joseph S Wisniewski
By Joseph S Wisniewski (Jun 23, 2011)

Not even close. Plenoptics only gives you the ability to decrease DOF from a maximum DOF set by the parameters of the microlens array. Dr. Ng's prototype could select DOF from f22 to f4, and to do that, he paid a terrible price in resolution, reducing a 16mp camera to 0.088mp.

If you can tolerate resolution that low, you can stop a lens down to f300 on FF, f200 on APS, and have 10-15 times the DOF of Dr. Ng's camera. That's how P&S cameras with tiny sensors get away with such huge DOF in their macros, by lowering resolution. Not to this silly an extreme, though.

Stacking goes in the other direction. A 100 shot stack at effective f11 (still critically sharp on APS) has the DOF of f*N/2 = f550, but the resolution of f11.
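
A minimal sketch of the stacking rule of thumb quoted above (an N-shot stack at aperture f behaves roughly like f*N/2 for depth of field while keeping the per-shot resolution); it simply reproduces the commenter's arithmetic rather than deriving it.

    def stack_equivalent_fnumber(per_shot_fnumber, n_shots):
        # Approximate DOF-equivalent f-number of an n-shot focus stack.
        return per_shot_fnumber * n_shots / 2

    print(stack_equivalent_fnumber(11, 100))  # 550.0, matching the f550 figure above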

0 upvotes
rfsIII
By rfsIII (Jun 23, 2011)

Maybe it won't be this exact camera or company, but this concept, once the necessary hardware horsepower has been developed, could put Canon, Phase One, Nikon, Leica, and all the others out of business. It will finally be possible for photographic artists to achieve their vision without the hassles of switching lenses, and bad focus, and lens aberrations and all the other baloney we have to put up with. Friction-free photography is almost here.
Hurry Dr. Ng, hurry.

1 upvote
Joseph S Wisniewski
By Joseph S Wisniewski (Jun 23, 2011)

No, it won't put anyone out of business. ;)

It doesn't let you get around switching lenses. In fact, since the plenoptic process dramatically decreases resolution, it reduces your ability to crop images to size, and increases the need to frame tight, so you zoom more and switch lenses more.

As far as lens aberrations, the only one of those that plenoptics addresses is curvature of field, and it addresses that incompletely.

0 upvotes
Alex Panoiu
By Alex Panoiu (Jun 23, 2011)

Correction -- spatial resolution reduction in the example is by a factor of 64, from 256 megapixels to 4 megapixels. Much better.

0 upvotes
Alex Panoiu
By Alex Panoiu (Jun 23, 2011)

From page 60 of the thesis (http://www.lytro.com/renng-thesis.pdf) I understand that if each microlens covers m x m pixels and the image was taken at f/N, then the image can be refocused anywhere in the depth of field of f/(N x m). For example, if the image is taken at f/4 and each microlens covers 8x8 pixels, then the image can be refocused anywhere in the depth of field of f/32. This also means that spatial resolution is reduced by a factor of 256, say from 256 megapixels to 1 megapixel.

The tradeoff is that each factor-of-2 loss in spatial resolution corresponds to one stop of "potential" depth-of-field increase.
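
A minimal sketch of the relation quoted from the thesis: with m x m sensor pixels behind each microlens and a main lens at f/N, refocusing is possible within roughly the DOF of f/(N*m), while spatial resolution drops by a factor of m*m (which matches the factor-of-64 correction posted above for the 8x8 example).

    def refocus_limit_fnumber(main_fnumber, m):
        # Refocusing is possible within roughly the DOF of f/(N*m).
        return main_fnumber * m

    def resolution_loss_factor(m):
        # Spatial resolution drops by the number of pixels behind each microlens.
        return m * m

    print(refocus_limit_fnumber(4, 8))  # 32 -> f/32 refocus range for the f/4, 8x8 example
    print(resolution_loss_factor(8))    # 64x fewer output pixels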

1 upvote
Gao Gao
By Gao Gao (Jun 23, 2011)

256? or 64?

0 upvotes
Joseph S Wisniewski
By Joseph S Wisniewski (Jun 23, 2011)

N-1, actually. Elsewhere in his thesis, Ren Ng comments that it's not possible to get the last stop of DOF out of a real system.

So, the f4 lens, 8x8 pixel microlens system gives you DOF from f4 to f16, not f32.

0 upvotes
maxotics
By maxotics (Jun 23, 2011)

What did Werner Heisenberg say? "What we observe is not nature itself, but nature exposed to our method of questioning." There are other methods of getting multiple focus points, like simple burst shots at multiple focus distances or apertures. They all cost TIME. As one poster said, the amount of directional data you need to collect would necessitate a huge sensor, and the larger the sensor, the slower the camera (to process the data).

Serious photographers would rather make a quick judgement about focus and aperture than wait for a camera to gather enough data to change it later.

Heisenberg might point out that you can either get prepared to take another shot in 1 second, or wait for the previous shot to get all the focus points, but you can't do both.

0 upvotes
CharlesGoodwin
By CharlesGoodwin (Jun 23, 2011)

It's not that everything is in focus, it's that the selective focus can be edited. Of course, isn't that what you can do with Photoshop's lens blur filter and a reasonably sharp photo (say a 28 to 35 mm shot @ f4 or f5.6)? Point is, with a reasonably wide-angle lens stopped down a bit, everything is in reasonable focus and it's not too hard to blur it... This seems more like a gimmick than an innovation. Now, if this could be adapted to generate something where everything in a snapshot really is in sharp focus, that would be remarkable.

2 upvotes
lylejk
By lylejk (Jun 23, 2011)

The latter part of your statement is exactly what plenoptics does. Think of it as a fly's eye (a bunch of circular lenses), each having a slightly different focus. All these images are then processed in such a way that you can change the point of focus at will. It does of course require multiple sensor pixels for each of the circular lenses, so right now the image you get is relatively low-res compared to current sensors. :)

0 upvotes
webfrasse
By webfrasse (Jun 23, 2011)

Yep, a Stanford researcher missed that little fact...in fact, Stanford missed it. Give them a call;-)

0 upvotes
lylejk
By lylejk (Jun 23, 2011)

Raytrix already makes a camera that does this; I read about it in PopSci and posted about it a few weeks ago. Will cost you $30K (not a mistake) though. lol

http://forums.dpreview.com/forums/read.asp?forum=1006&message=38644917

0 upvotes
f_stops
By f_stops (Jun 23, 2011)

The technology is interesting . . .

But as a useful "breakthrough" I would rank this a little above "Smile Detection" and well behind "Live View."
Camera makers are not even close to perfecting autofocus yet.

2 upvotes
Marla
By Marla (Jun 23, 2011)

You can see more on their website... including an article, a video and some more images to play with.

http://blog.lytro.com/

Maria

0 upvotes
Chuck Lantz
By Chuck Lantz (Jun 22, 2011)

Is this essentially a sort of "focus bracketing", but all within one set of image information, whereby a range of sharp focus points are all within that info set, and the user then is able to choose which of the various images within that set works best?

If so, then I wonder how wide the bracket is? It seems there would be a limit, if only to keep the files smaller.

0 upvotes
LVPhoto1
By LVPhoto1 (Jun 22, 2011)

Boy! Does this mean that the average photographer can focus now…"wow" too much!!

0 upvotes
MarcMedios
By MarcMedios (Jun 22, 2011)

This is one of those "Holy Shit!" moments. This completely changes photography. I honestly don't know how someone can say "it's just a matter of finding the right algorithm". This is major, major stuff, a radical change from what photography used to be, from what we used to think pictures needed to be. Amazing.

3 upvotes
lbjack
By lbjack (Jun 22, 2011)

Guess you like taking what others say out of context so you can score points. OF COURSE it's major stuff, and I say so.

1 upvote
lbjack
By lbjack (Jun 22, 2011)

Alas, the photo here is a cheat, but this really IS cool. If the data -- i.e. the photons -- are there, then it's just a matter of finding the right algorithm to make them coherent. Of course it's extremely complex in application, and Mr. Ng is a genius -- a new Edwin Land.

Blurb on their Website: "Lytro will debut the first light field camera for everyone. OK – you’re not everyone. You are a beautiful, unique snowflake. And you deserve an amazing camera...."

Love 'em already!

1 upvote
Eridan_Fetahagic
By Eridan_Fetahagic (Jun 22, 2011)

Really cool.

0 upvotes
James Madara
By James Madara (Jun 22, 2011)

In the future we could change depth of field like white balance in camera RAW. I like the idea of clicking where you want the focus even if it is a Flash magic trick.

0 upvotes
Martin Budden
By Martin Budden (Jun 22, 2011)

Although changing the point of focus is a nifty feature, this seems to have overshadowed something else - the camera lens can be fixed focus. This means the lens can be smaller and cheaper and also, importantly, there is no waiting for the camera to focus.

3 upvotes
alwye
By alwye (Jun 22, 2011)

Digital over film - if you don't like the shot, take another one, or another 10.
With this tech, you don't even have to take another one, just re-focus. Really, it's impossible for you to take a bad photo.

0 upvotes
Cy Cheze
By Cy Cheze (Jun 22, 2011)

Not so. A low-light photo could still be blurry, no matter what the focus setting, due to a tiny sensor, hand jitter, or subject motion. You cannot unblur the blur any more than you can turn hamburger into a living cow.

0 upvotes
alwye
By alwye (Jun 22, 2011)

Not yet. In 10 years, you might...

0 upvotes
ptodd
By ptodd (Jun 23, 2011)

It has been suggested that similar techniques can help with motion blur of the camera and within the scene, as mentioned at around 2:20 in http://www.youtube.com/watch?v=-EI75wPL0nU

0 upvotes
JWest
By JWest (Jun 22, 2011)

Notice how the Flash widget is actually presenting a number of discrete layers, and using a visual effect to "refocus" between these layers depending on where in the image you click.

For example, try focusing on the back wall. You'll not get the focal point any further away than the rear flask. Also, try refocusing by clicking on various points along the right hand edge of the workbench, where the holes are. You'll find that focus always moves to the rear flask, regardless of how far forward you click.

This is certainly an intriguing concept, and the science behind it is perfectly sound. But I'm not getting my hopes up until I can hold one of these cameras in my hands and see the results for myself.

2 upvotes
pbolton
By pbolton (Jun 22, 2011)

You can read up on the technology at
http://graphics.stanford.edu/papers/lfcamera/lfcamera-150dpi.pdf

A downside is that the camera will have a fixed f-number, as the f-numbers of the main lens and the microlenses need to be the same (my guess is that the first model will be at f/4).
Getting this to work with a zoom will require lots of tradeoffs.

This is for the point-and-shoot market and not for the DSLR folks.
It seems this could be adapted to cell-phone-type cameras as well.

1 upvote
Michal59
By Michal59 (Jun 22, 2011)

Interesting. I have been following this concept for several years. It seems very promising from a technical point of view. For me, the theory is a bit complex, but one can easily imagine its usability in technical applications. Or maybe, in the future, fully featured cameras will let us correct wrongly focused, precious images. AFAIK, however, they use 144 pixels to calculate one effective picture pixel, so you need a 1,440,000,000-pixel chip to produce a 10Mpix refocusable image :)
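
A quick check of the arithmetic in this comment, assuming the quoted 144 sensor pixels per output pixel (a 12x12 block per microlens would give that, though the block size is an inference, not a published figure):

    def required_sensor_pixels(output_megapixels, pixels_per_output_pixel):
        # Sensor pixels needed to deliver the requested refocusable output resolution.
        return output_megapixels * 1e6 * pixels_per_output_pixel

    print(required_sensor_pixels(10, 144))  # 1.44e9, i.e. the 1.44-gigapixel chip mentioned above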

1 upvote