Is there a way to test for ento/tele/hypercentricity of a system?

BobORama
I'm trying to solve a problem with actual lenses, so this is more of a practical than a theoretical question. I have a long-distance microscope positioned some distance from a subject (inches to feet). I stumbled on the fact that combinations of focuser position and rear extension tube length can result in the system being entocentric or hypercentric, so I assume there is a magic position in the middle which is telecentric.

So my question is: is there some way I can visually test for this condition by viewing a special subject / target under the scope as I'm futzing? I'm thinking maybe a hollow cylinder, where I could image the inside and outside surfaces and the face: if hypercentric I would be able to image the outer surface, and when entocentric the inner surface. Or is there some stupid-easy way to do this that I am not seeing?


-- Bob
http://bob-o-rama.smugmug.com -- Photos
http://www.vimeo.com/boborama/videos -- Videos
 
What I usually do to adjust things for the telecentric condition is have a grid target mounted on a spring-loaded translation stage, and also stop the lenses down a lot. Then, as I adjust things, I watch the size of the grid pattern on the monitor and find the configuration where it stays the same as I use my hand to move the carrier block on the stage. It works pretty fast for me.
 
Then, as I adjust things, I watch the size of the grid pattern on the monitor and find the configuration where it stays the same as I use my hand to move the carrier block on the stage. It works pretty fast for me.
I got a little lost with the translation stage bit, but essentially you are moving the target along the Z axis and looking for magnification changes between front- and back-focused positions? I mean, that sounds pretty simple.


 
What about putting a pile of black drinking straws in front of the camera with a bright light behind? When the white circles line up with no black straw-tube walls visible, you're good?
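The geometry behind the straw idea can be sketched quickly. This is a rough model of my own (the function name and example numbers are made up, not from this thread): a chief ray tilted by some angle to the straw axis walks sideways while passing through the straw, and it clears the bore, showing only a white circle, only while that sideways walk is smaller than the bore diameter.

```python
import math

def sees_clear_bore(theta_deg, length_mm, bore_mm):
    """True if a ray at theta_deg to the straw axis passes through a
    straw of the given length and bore without touching the wall."""
    # Lateral walk over the straw length must stay inside the bore.
    return length_mm * math.tan(math.radians(theta_deg)) < bore_mm

# A 50 mm straw with a 5 mm bore tolerates chief-ray angles up to
# about atan(5/50) ~ 5.7 degrees before the wall comes into view.
print(sees_clear_bore(0.0, 50.0, 5.0))   # near-telecentric: True
print(sees_clear_bore(10.0, 50.0, 5.0))  # strongly ento/hypercentric: False
```

So the longer and narrower the straws, the tighter the angular tolerance, and the more sensitive the test.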
 
Yes, that is all there is to it: just look for a configuration where the size of the image does not change when the object is moved along the Z axis.

The bit about the translation stage is just how I have found it convenient to do things in the lab. When I show this to other engineers, they typically start off wanting to turn the micrometer screw on the stage, and the change in image size is then so slow that one can't judge things. Pulling against the spring by hand just makes things move fast enough that easy judgements can be made.
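The condition being tested can be illustrated with a toy paraxial model (my own sketch, not from this thread): trace the chief ray, the ray through the center of the aperture stop, from an off-axis object point through a thin lens. With the stop at the rear focal plane (object-space telecentric), the height where that ray lands on the sensor no longer depends on the object distance, which is exactly why the grid size stops changing as the target moves in Z.

```python
def chief_ray_height(h, z, f, stop, sensor):
    """Paraxial chief-ray height on the sensor for an object point at
    height h, distance z in front of a thin lens of focal length f.
    'stop' and 'sensor' are distances behind the lens."""
    # Slope leaving the object, solved so the refracted ray crosses
    # the axis at the stop center.
    u = h * (1.0 / f - 1.0 / stop) / (1.0 - z / f + z / stop)
    y = h + u * z            # height at the lens
    u2 = u - y / f           # slope after the thin lens
    return y + u2 * sensor   # height on the sensor

f, sensor = 50.0, 100.0      # 2f-2f imaging of an object near z = 100
for stop in (40.0, 50.0, 60.0):   # stop before / at / behind rear focus
    sizes = [chief_ray_height(1.0, z, f, stop, sensor)
             for z in (95.0, 100.0, 105.0)]
    print(stop, [round(s, 4) for s in sizes])
```

Only the stop-at-rear-focus row prints the same height for all three object distances; the other two rows show the size growing or shrinking with Z, i.e. the ento/hypercentric cases.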

When we make fine adjustments, we have software routines to match the grid and give the average spacing in pixels.
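A minimal sketch of that kind of measurement (my own guess at one way to do it, not the actual routine described above): project the grid image onto one axis and take the strongest non-DC peak of its FFT to get the average period in pixels.

```python
import numpy as np

def grid_spacing_px(profile):
    """Estimate the dominant period, in pixels, of a 1-D intensity
    profile taken across a grid target."""
    profile = profile - profile.mean()      # drop the DC component
    spectrum = np.abs(np.fft.rfft(profile))
    k = int(np.argmax(spectrum[1:])) + 1    # strongest non-DC bin
    return len(profile) / k                 # period = N / bin index

# Synthetic grid: 1024-pixel line with a 32-pixel period.
x = np.arange(1024)
profile = np.cos(2 * np.pi * x / 32.0)
print(grid_spacing_px(profile))             # 32.0
```

Comparing this number at two Z positions of the target gives the same telecentricity judgement as watching the grid on the monitor, just with sub-pixel resolution.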
 
In a perfect world, this would just be shown by software using the already-existing OPSDAF points; there are typically different rows optimized for different exit pupil locations.
 
