Digital lenses, any experts in optics?

  • Thread starter: Rinus Borgsteede (Guest)
This is a diagram that was presented to me in another post, and it shows the newer type of design that could accommodate CCDs better than the older generation of lenses.
The idea is that the light from a lens strikes the CCD straight on.

My understanding of optics is limited but somehow I cannot see how the bottom design can work.

Light travels in a straight line, and the rays in the corners must emanate from the center of any lens.

I do not need theoretical answers, I already see that in the theoretical diagram.
Is there anyone that can explain this?
Rinus of Calgary

 
This is a diagram that was presented to me in another post, and it
shows the newer type of design that could accommodate CCDs better
than the older generation of lenses.
Well, there's a whole bunch of things wrong with these diagrams.

First, a ray of light does not just travel in straight lines, as those diagrams indicate. Every time it passes from one medium to another (air to glass, glass to different kind of glass) it changes direction according to well defined laws of refraction.

If you can compute (or enlarge the pictures and measure) the angle of incidence of the ray with each of these interfaces, you will see exactly where the ray enters and leaves each element of the lens.
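That refraction step can be sketched numerically. Here is a minimal Snell's-law helper in Python; the indices 1.00 and 1.52 are just typical values for air and crown glass, not taken from either diagram:

```python
import math

def refract_angle(theta_i_deg, n1, n2):
    """Angle of refraction (degrees) via Snell's law: n1*sin(i) = n2*sin(r)."""
    sin_r = n1 * math.sin(math.radians(theta_i_deg)) / n2
    if abs(sin_r) > 1.0:
        raise ValueError("total internal reflection")
    return math.degrees(math.asin(sin_r))

# A ray hitting glass (n = 1.52) from air (n = 1.00) at 30 degrees
# bends toward the normal:
print(refract_angle(30.0, 1.00, 1.52))  # ~19.2 degrees
```

A real ray trace just repeats this calculation at every surface the ray crosses, which is why the straight-line diagrams are only schematic.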

Second, the diagram radically overstates the differences between the two lens designs. A design such as the bottom one will have its light strike the film at a steeper angle than the first design, but it's not as radical as the pictures suggest. The first lens won't really look like a radiating point, and the second lens won't look like a perfectly parallel spotlight. Reality is somewhere in between.

Third, it's not an "old and new" situation. Both of the lenses illustrated are old designs. The first is a "symmetrical" lens, which means its rear elements are going to be more or less the same distance from the film or sensor as the focal length of the lens. (I say more or less, because a multi-element lens actually "radiates" light from a mathematically calculated point called the rear nodal point.) The second diagram is a "retrofocus" lens, because it focuses onto a location farther from the film than would be expected from its focal length.

In general, for a given amount of glass, the symmetrical designs outperform the retrofocal designs, so retrofocus lenses are only used when needed.

SLRs "need" retrofocus lenses when the back elements of a wide angle lens would interfere with the SLR mirror.

Digital cameras "like" them even when there's no physical interference, because the retrofocus design means that light is "more perpendicular" to the sensor than light from a symmetrical lens, so there is less vignetting in the corners and less exaggeration of chromatic aberration.
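The "more perpendicular" claim is easy to quantify: the chief ray's angle at a sensor corner depends on how far the exit pupil sits in front of the sensor, and a retrofocus design pushes that pupil farther away. A rough Python sketch; the pupil distances below are made-up illustrative numbers, not real lens data:

```python
import math

def chief_ray_angle(corner_radius_mm, exit_pupil_dist_mm):
    """Angle (degrees) between the chief ray and the sensor normal at a
    point corner_radius_mm off-axis, for an exit pupil sitting
    exit_pupil_dist_mm in front of the sensor."""
    return math.degrees(math.atan(corner_radius_mm / exit_pupil_dist_mm))

# A 35mm-format corner is about 21.6 mm off-axis.
corner = 21.6
print(chief_ray_angle(corner, 30.0))   # exit pupil close in: ~35.8 deg
print(chief_ray_angle(corner, 100.0))  # exit pupil pushed out: ~12.2 deg
```

Same sensor, same corner; only the pupil distance changed, and the corner ray went from a steep grazing angle to something a sensor well handles.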
The idea is that the light from a lens strikes the CCD straight on.
My understanding of optics is limited but somehow I cannot see how
the bottom design can work.
Light travels in a straight line, and the rays in the corners must
emanate from the center of any lens.
Again, light doesn't travel in a straight line; it makes a turn at every "interface", every place where it passes from air to glass.
I do not need theoretical answers, I already see that in the
theoretical diagram.
Is there anyone that can explain this?
Rinus of Calgary
Probably not, without drawing some diagrams of my own.

Ciao!

Joe
 
Joe, I would like to use Rinus's post to ask you a question.

I was wondering whether there would be any advantage to going with a curved CCD array and a curved-field lens, like in biological designs and some panoramic cameras?

If the CCD would like the light to come at it more straight on, would it help for these arrays to be more concave?

Works in nature

Kraig
 
Joe, I would like to use Rinus's post to ask you a question.

I was wondering whether there would be any advantage to going with
a curved CCD array and a curved-field lens, like in biological
designs and some panoramic cameras?

If the CCD would like the light to come at it more straight on,
would it help for these arrays to be more concave?
It would. A curved sensor would also be a real ***** to manufacture.

Petteri
--
http://www.seittipaja.fi/index/
 
First, a ray of light does not just travel in straight lines, as
those diagrams indicate. Every time it passes from one medium to
another (air to glass, glass to different kind of glass) it changes
direction according to well defined laws of refraction.
In both drawings they use the optical centre of the lens system, and all rays going through the optical centre won't be bent.

So, your reasoning doesn't show a bit of understanding.

jacques.
 
Hi Rinus,

You are totally right; all these writings about special digital lenses are just a bunch of stupid commercial b.lsh.t.

A lens system can be reduced to a single virtual lens that does the same job, i.e. has the same f and the same optical centre.

For every point of the subject, one just has to draw the three construction lines: the one through the optical centre, which goes straight on as it is not bent; the second parallel to the optical axis, bent toward the focal point behind the lens; and the third through the focal point in front of the lens, bent so that it becomes parallel to the optical axis after the lens. Do this for every point of the object and you will construct the image. I'm sure you know this too.

All parts of the virtual lens are used for every point of the object to form a single point of the image, at least for all points of the object lying in the focal plane.
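For an ideal thin lens, the construction above reduces to the familiar 1/do + 1/di = 1/f together with the magnification m = -di/do. A small Python sketch; all the numbers are illustrative:

```python
def image_point(x_obj, y_obj, f):
    """Image of an off-axis object point through an ideal thin lens at
    the origin (axis along x). x_obj is the (positive) object distance,
    f the focal length. Uses 1/do + 1/di = 1/f and m = -di/do."""
    do = x_obj
    di = 1.0 / (1.0 / f - 1.0 / do)
    m = -di / do
    return di, m * y_obj

# Object point 10 mm off-axis, 300 mm in front of a 50 mm lens:
di, y_img = image_point(300.0, 10.0, 50.0)
print(di, y_img)  # 60.0 mm behind the lens, -2.0 mm (inverted)
# The undeviated central ray confirms it: slope -10/300, so at
# x = 60 mm the height is -10/300 * 60 = -2.0 mm.
```

The central construction ray alone fixes the image height, which is exactly why it is the handiest of the three lines.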

The commercial boys and girls, knowing not a thing about optics, are trying to create selling points by making stupid drawings; the same kind of drawings are used for promoting special digital lenses as well as the 4/3 system.

jacques.

But as the majority of people have never studied the basics of optics, they can get away with it, and over coffee some of your colleagues will tell you what kind of great lenses they have invented. ;-)
 
Hi Rinus,

You are totally right; all these writings about special digital
lenses are just a bunch of stupid commercial b.lsh.t.

A lens system can be reduced to a single virtual lens that does the
same job, i.e. has the same f and the same optical centre.

For every point of the subject, one just has to draw the three
construction lines: the one through the optical centre, which goes
straight on as it is not bent; the second parallel to the optical
axis, bent toward the focal point behind the lens; and the third
through the focal point in front of the lens, bent so that it
becomes parallel to the optical axis after the lens. Do this for
every point of the object and you will construct the image. I'm
sure you know this too.
Yes, I agree, and I also have to say that any image point must by the laws of optics pass through the center of the lens, where the size and location of the aperture determine the ultimate angle at which the light beam strikes.
All parts of the virtual lens are used for every point of the
object to form a single point of the image, at least for all points
of the object lying in the focal plane.

The commercial boys and girls, knowing not a thing about optics,
are trying to create selling points by making stupid drawings; the
same kind of drawings are used for promoting special digital lenses
as well as the 4/3 system.
The so-called back focus, or distance from lens mount to film, is of no consequence at all except for making smaller camera bodies and the easier accommodation of wider lenses.

Obviously, retrofocus designs could be introduced to all the newer lenses to standardize the distance of the virtual nodal points, so that CCDs can be made with predictability in regard to compensated (probably reverse-programmed) CCD fall-off. Maybe even an extra pin to tell the body the type of fall-off from a particular lens.
Thanks, Jacques
Rinus of Calgary
jacques.

But as the majority of people have never studied the basics of
optics, they can get away with it, and over coffee some of your
colleagues will tell you what kind of great lenses they have
invented. ;-)
 
Hi Joseph,

By light travelling in a straight line, I mean that it goes at all times through the center of the lens. Obviously every element redirects the single-point light beam for correct convergence (at least, that is the ideal).

If I were to send a laser beam through the lens as seen in these samples, I cannot see how an off-center beam could come out of that lens the way it is suggested in the bottom sample.
As such the diagram is incorrectly predicting the path.

I cannot even see that it is partly correct but I have an open mind and am quite willing to learn new laws of optics.

The issue of retrofocus (for wider lenses) is probably the only real help that the CCD is going to get, as it keeps the light rays closer to perpendicular than standard design lenses do.

Ultimately, the idea that smaller lenses can be made for CCDs is a wrong assumption, as longer designs (of wide lenses) would be much more forgiving.

Thanks Joseph,
Rinus of Calgary
 
The diagram is a very gross oversimplification, but there is a point it is trying to make. Yes, the light bends at the interface, but that is not the point. Basically the point of this diagram (I think first publicised by Olympus) is to cut the angle of light striking the sensor.

The issue is the angle at which the light strikes the sensor. Various sensor technologies are affected differently by the angle at which the light strikes them, the so-called "acceptance angle." Film, on the other hand, does not have this sensitivity. As the angle gets steep, some sensor technologies work less well. There may be other techniques, such as micro-lenses, that can counteract this acceptance-angle issue.
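To illustrate the acceptance-angle effect, here is a deliberately crude model in Python: response is assumed to fall off like cos^n of the incidence angle. The exponent is a made-up fit parameter for illustration, not data from any real sensor:

```python
import math

def relative_response(angle_deg, n=4):
    """Toy model: sensor response assumed to fall off roughly like
    cos^n of the incidence angle. n is an invented fit parameter,
    not measured sensor data."""
    return math.cos(math.radians(angle_deg)) ** n

# Response drops quickly as rays tilt away from perpendicular:
for a in (0, 10, 20, 30):
    print(a, round(relative_response(a), 3))
```

Whatever the true curve for a given sensor, the qualitative story is the same: steep corner rays lose signal relative to the perpendicular rays at the center, which shows up as corner fall-off.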
This is a diagram that was presented to me in another post, and it
shows the newer type of design that could accommodate CCDs better
than the older generation of lenses.
The idea is that the light from a lens strikes the CCD straight on.
My understanding of optics is limited but somehow I cannot see how
the bottom design can work.
Light travels in a straight line, and the rays in the corners must
emanate from the center of any lens.
I do not need theoretical answers, I already see that in the
theoretical diagram.
Is there anyone that can explain this?
Rinus of Calgary

 
In both drawings they use the optical centre of the lens system,
and all rays going through the optical centre won't be bent.

So, your reasoning doesn't show a bit of understanding.
Well, it's obvious that someone here doesn't show a bit of understanding. I'm going to bet it's not the guy who wrote his first ray tracing lens design program back in 1987 while working on infrared optics at Microdot.

That aside, let's get back to first principles.

First, any lens of non zero thickness has two "nodal points". Chapters 5 and 6 of Hecht and Zajac, "Optics", 1974 discuss this in considerable detail. (I have other books that are more up to date, but Hecht is so easy to follow).

This is true even of a single lens element. (Look at the illustration on p. 168).

In order for a lens to have an "optical center", the lens would have to have zero thickness. Since neither of Rinus's diagrams show fresnel or holographic elements, we'll conclude that this isn't the case.

I'm not sure what the first diagram is. It could be an Angulon, but it looks more like someone just screwed up drawing a common double Gauss.

The second is a simple (and horrible) wide angle, basically just the first lens with the last positive element removed to increase the back focus. Simple, easy to analyze (and high in aberrations). The rear node shifts back, we move the whole lens forward, and incident angles become more perpendicular to the film.
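The nodal-point bookkeeping is real even for one thick element, and can be made concrete with the lensmaker's equation including the thickness term; a quick Python sketch, where the radii, thickness, and index are arbitrary example values:

```python
def thick_lens(n, r1, r2, d):
    """Effective focal length and back focal distance of a single thick
    lens (lensmaker's equation with thickness term). Radii use the
    usual sign convention: positive when the center of curvature lies
    to the right of the surface."""
    inv_f = (n - 1) * (1/r1 - 1/r2 + (n - 1) * d / (n * r1 * r2))
    f = 1 / inv_f
    bfd = f * (1 - (n - 1) * d / (n * r1))  # back focal distance
    return f, bfd

# A symmetric biconvex element, 10 mm thick, n = 1.5:
f, bfd = thick_lens(1.5, 50.0, -50.0, 10.0)
print(round(f, 2), round(bfd, 2))  # f ~ 51.72, bfd ~ 48.28
# f is measured from the rear principal plane, which here sits
# f - bfd ~ 3.45 mm inside the glass -- so the lens cannot be treated
# as radiating from a single "optical center" on its surface.
```

With the lens in air on both sides, the nodal points coincide with the principal points, which is why even this single element has two of them.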

What are the educational standards in the Netherlands? In the US, even high school physics included enough optics to identify nodal points and perform basic geometric optics such as ray tracing.

Ciao!

Joe
 
Joe, I would like to use Rinus's post to ask you a question.

I was wondering whether there would be any advantage to going with
a curved CCD array and a curved-field lens, like in biological
designs and some panoramic cameras?

If the CCD would like the light to come at it more straight on,
would it help for these arrays to be more concave?
It would work great in a digital camera too; it would simplify the whole rear section of the lens.

But, as Petteri pointed out, building the sensor would be a bloody pain.

As well as needing a whole new batch of lens designs.

And where do you put the shutter?

Ciao!

Joe
 
It would. A curved sensor would also be a real ***** to manufacture.
Yeah, it would.

But maybe not, if we revive an old idea. In the days before flat displays (LCD, plasma, even flat-fronted CRT) they used to turn a curved-front CRT into a flat CRT with a fiber optic faceplate: just a bunch of optical fibers, bound into a plate, ground and polished flat on one side, and concave to match the convex CRT on the other side.

This could be reversed: the flat surface faces the sensor, the curved surface faces the lens.

Or maybe it's too late in the day for such thoughts and I'm just rambling.

Ciao!

Joe
 
In both drawings they use the optical centre of the lens system,
and all rays going through the optical centre won't be bent.

So, your reasoning doesn't show a bit of understanding.
Well, it's obvious that someone here doesn't show a bit of
understanding. I'm going to bet it's not the guy who wrote his
first ray tracing lens design program back in 1987 while working on
infrared optics at Microdot.

That aside, let's get back to first principles.

First, any lens of non zero thickness has two "nodal points".
Chapters 5 and 6 of Hecht and Zajac, "Optics", 1974 discuss this in
considerable detail. (I have other books that are more up to date,
but Hecht is so easy to follow).

This is true even of a single lens element. (Look at the
illustration on p. 168).

In order for a lens to have an "optical center", the lens would
have to have zero thickness. Since neither of Rinus's diagrams show
fresnel or holographic elements, we'll conclude that this isn't the
case.

I'm not sure what the first diagram is. It could be an Angulon, but
it looks more like someone just screwed up drawing a common double
Gauss.

The second is a simple (and horrible) wide angle, basically just
the first lens with the last positive element removed to increase
the back focus. Simple, easy to analyze (and high in aberrations).
The rear node shifts back, we move the whole lens forward, and
incident angles become more perpendicular to the film.

What are the educational standards in the Netherlands? In the US,
even high school physics included enough optics to identify nodal
points and perform basic geometric optics such as ray tracing.
I do not know about physics in high school, as the Dutch have a different system of schooling.

I went to a special Graphic Arts Academy and then a school for photography, where I learned most of my early knowledge. Optics are a fun thing to get into, but the puzzle seems to be left to the marketing people, who have no idea how to convey simple optical laws and do not care either.

Nice bright color (with very high, useless contrast as a side effect) was sold to pretty much everybody who lived under the Kodak umbrella. It is a successful campaign, even if it is totally misleading and at the least inaccurate.

Either way, it is fun to muse about these things and see where the general public stands on these matters.
Rinus of Calgary
Ciao!

Joe
 
Thanks for the response, Joe and Petteri.
Some slide projector lenses do this.

Could the shutter be a leaf type in the lens like that on all but my 35mm/DSLR gear? Do we need an actual shutter? Just let the arrays have an electronic timer.

Some are designing whole new lens systems for digital anyway.

Making the chips

These are custom chip lines already, I thought. Could they form them over a curved mold, or build them on something pliable and then add the sphere shape? I don't think they would need much of a curve.

If all this was possible could there be a curved mirror for an SLR design or would it need to be a fixed lens camera?

Inquiring mind here, not a claim of actual understanding of how they make this stuff, or I would get my thinking back in the box.

Kraig
 
Kraig wrote:
[snip]
These are custom chip lines already, I thought. Could they form
them over a curved mold, or build them on something pliable and
then add the sphere shape? I don't think they would need much of a curve.
Unfortunately not, as far as I know. Chips are made on a semiconductor substrate, usually silicon, but there are others such as gallium arsenide. This substrate is crystalline and mineral. The idea is that you grow a crystal, then cut it into thin wafers, and then use techniques surprisingly similar to enlarging photos in a darkroom to "print" and etch the circuits on them using a variety of materials. I don't know of any pliable semiconductor, and it would be extremely complicated to cut a concave wafer out of the crystal. Wafers are rather expensive as it is: somebody quoted a figure of $1000 per wafer (of course, each of them makes a fair bunch of circuits): I'm sure that a concave wafer would cost maybe 10...100 times as much... not even considering the actual circuit printing issues.
If all this was possible could there be a curved mirror for an SLR
design or would it need to be a fixed lens camera?
It would need to be curved, I guess, but I don't see a problem there.

[snip]

Petteri
--
http://www.seittipaja.fi/index/
 
Hi Joe,

I always use the principle of KISS.

There's no point in doing quantum or wave mechanics, trying to use Schrödinger equations, or trying to apply string theory to this.

The basic drawings are, even at the most simplified level, totally wrong.

I never used ray tracing, but I can imagine it's like using the old-fashioned Snellius (Snell's law) for each ray at every medium-to-medium surface: a lot of computation, and when I studied I had to use a slide rule. ;-)

jacques,

no abuse meant.
 
I believe that I was the one answering Rinus's question and posting the lens drawings ... here's the original posting (fixed some misspellings):

"Rinus: What I am saying is that to be perfect, you have to create a new generation of lenses anyway. Placing the lens too close to the film plane will force light to strike the sensor at a great angle, which will create artificial chromatic aberration and bleeding into neighboring sensing areas ... this is a fact. What we have to do is to minimize the variation of the angles. In film emulsion the problem is taken care of by the random distribution of grains that averages out the effects.

Possible or not...take a look at this picture:

http://www.pbase.com/image/1494688

Get it? To really get the most out of your lens, it has to be tailor-made for the sensor (MTF, anti-aliasing, etc.) ... but who cares about perfection ..???

I will not buy a DSL. " (End of original posting)

So ... what I state here is that you have to minimize the possibility that light strikes at angles greatly deviating from perpendicular.

=> Incoming light to be as perpendicular as possible

That's why Leica can't use their existing lenses and expect perfection... as they usually do ;-). They are way too close to the sensor, with angles more than 45 degrees from perpendicular striking the sensor. B.t.w., the picture shown here is actually taken from an Olympus "White Paper" to save some time.

There are other specific "digital issues" that we have to address, like aliasing etc. Take a look at this link:

http://www.schneiderkreuznach.com/knowhow/digfoto_e.htm

This shows clearly that when designing the lens, you do have to make "digital" considerations...

...so throw away all your (my!) old stinking Nikon & Canon Dinosaur glass!!! ;-))

Cognac
 
Hi Joe,

I always use principle of KISS.
Good basic principle!
There's no point in doing quantum or wave mechanics, trying to use
Schrödinger equations, or trying to apply string theory to this.

The basic drawings are, even at the most simplified level, totally
wrong.

I never used ray tracing, but I can imagine it's like using the
old-fashioned Snellius (Snell's law) for each ray at every
medium-to-medium surface: a lot of computation, and when I studied
I had to use a slide rule. ;-)
Having seen plenty of ray drawings in technical papers, I also know this is a very simple and misleading drawing, but it seems to satisfy quite a few sheep.
jacques,

no abuse meant.
None taken!
Rinus of Calgary
 
I believe that I was the one answering Rinus question and posting
the lens drawings ... here's the original posting (fixed some
misspellings):

"Rinus: What I am saying is that to be perfect, you have to create
a new generation of lenses anyway. Placing the lens too close to
the film plane will force light to strike the sensor at a great
angle, which will create artificial chromatic aberration and
bleeding into neighboring sensing areas ... this is a fact. What we
have to do is to minimize the variation of the angles. In film
emulsion the problem is taken care of by the random distribution of
grains that averages out the effects.

Possible or not...take a look at this picture:

http://www.pbase.com/image/1494688

Get it? To really get the most out of your lens, it has to be
tailor-made for the sensor (MTF, anti-aliasing, etc.) ... but who
cares about perfection ..???

I will not buy a DSL. " (End of original posting)

So ... what I state here is that you have to minimize the
possibility that light strikes at angles greatly deviating from
perpendicular.

=> Incoming light to be as perpendicular as possible

That's why Leica can't use their existing lenses and expect
perfection... as they usually do ;-). They are way too close to
the sensor, with angles more than 45 degrees from perpendicular
striking the sensor. B.t.w., the picture shown here is actually
taken from an Olympus "White Paper" to save some time.

There are other specific "digital issues" that we have to address,
like aliasing etc. Take a look at this link:

http://www.schneiderkreuznach.com/knowhow/digfoto_e.htm

This shows clearly that when designing the lens, you do have to
make "digital" considerations...
I agree that lenses for digital cameras need a new approach regarding optimizing distance to the recording CCD surface.

I am sure, as I said before, that this would have been the reason Contax walked away from releasing their full-size CCD. The lenses will not be consistent, and massive fall-off, chroma problems, or a host of other problems may occur.

Leica, as well as anyone, could make the digital jump if they kept the CCD to the small 1.5x size, as was proven by previous makers.

The choice of Leica and Contax (and maybe others) not to go full size was the risk I explained before.

The new lens formula may be as simple as setting the rear nodal point to a fixed ideal distance range (or more), but this could only happen if all parties agree.

From what I have seen, even the 1.5x CCDs are not giving consistently high resolution with a lens when that lens is moved to a different CCD (camera).

This new 4/3 format is supposed to standardise those variances, but I think they are not going to convince people with the marketing stunts. It will just be a flash in the pan, even if it is based on real, honest, and sound optical principles.

The drawing that we have been using is a really misleading piece of info, but I knew that the intent was correct.
The new link you provided is an interesting article to read. Thanks,
Rinus of Calgary
...so throw away all your (my!) old stinking Nikon & Canon Dinosaur
glass!!! ;-))

Cognac
 
These are custom chip lines already, I thought. Could they form
them over a curved mold, or build them on something pliable and
then add the sphere shape? I don't think they would need much of a curve.
Unfortunately not, as far as I know. Chips are made on a
semiconductor substrate, usually silicon, but there are others such
as gallium arsenide. This substrate is crystalline and mineral. The
idea is that you grow a crystal, then cut it into thin wafers, and
then use techniques surprisingly similar to enlarging photos in a
darkroom to "print" and etch the circuits on them using a variety
of materials. I don't know of any pliable semiconductor, and it
would be extremely complicated to cut a concave wafer out of the
crystal. Wafers are rather expensive as it is: somebody quoted a
figure of $1000 per wafer (of course, each of them makes a fair
bunch of circuits): I'm sure that a concave wafer would cost maybe
10...100 times as much... not even considering the actual circuit
printing issues.
Silly idea time.

Crystalline silicon is good for very fine geometry (down to 0.08 micron). But image sensors don't need that kind of resolution; they currently live in the world from 5 to 8 microns, almost 100 times bigger than the features on a high-end microprocessor (for example).

So, how about amorphous (non-crystalline) silicon? It's been used successfully for the huge electronics on LCD displays (up to 500 mm diagonal rectangles) and flexible electronics (including flexible solar cells).

It could be grown directly on a "bowl" shaped support structure.

Ciao!

Joe
 
