Science Fiction: Pentax K0.1D!

Wow- not a bad concept at all! It would be nice to have tiny DSLRs in the future!

Perhaps you'd be interested in posting this on the site below?

--
Mo (Adam)

^The internet's first Pentax-Dedicated Forums!!!^
 
That is nice - but maybe suggest the green-to-blue gradation ring: Pentaxians are friends with Samsu.... (any suggestions?) ones - this is the reality... :-)

Best, JR
 
Thanks for feedback so far!

Yes, it takes K mount lenses.

Here is a picture of the back of the camera:



It doesn't have a screen because pictures can be viewed through the EVF/SLR hybrid viewfinder.
 
What is that little green bar where the flash is supposed to be? Is that thrusters? ;)

Actually, I think you are way off. I would be willing to bet that in the next 50 years we will not have shutters or lenses. It will all be handled by the imaging sensor. Then things will really get interesting: an all-in-one unit with 1000X image-stabilized zoom, thanks to anti-gravity station-keeping units that allow the camera to hover perfectly still even in a hurricane.

Robert
 
There was no space for the Pentax logo, so Pentax decided to use the recognizable green line instead. In the future people will recognize brands by color, the way Coca-Cola is red and IBM is blue, because there is often no space for logos.

In the future, zooming will take place by changing the curvature of a flexible CCD-chip. Currently all CCDs are flat, and that makes the optics bulky. A CCD-chip that can be bent from flat to semi-circular (either way!) will allow for an infinite variety of focal lengths. However, the camera pictured is not that far into the future, and CCDs are not flexible yet.
 
I am bummed. I was hoping the green bar was thrusters. So it wouldn't need a tripod hole. It just hovers where you want it. Since anti-gravity suspension hasn't been perfected yet. ;)

Robert
 
Hi Rune!

Very nice design. The handle is on the verge of looking flimsy rather than elegant, though.
There was no space for the Pentax logo,
Oh there is, if you wanted to have a logo.
In the future, zooming will take place by changing the curvature of a
flexible CCD-chip.
Now, as an engineer, I would love to hear a designer's idea of how this changes field of view.
Currently all CCDs are flat, and that makes the optics bulky.
I doubt you can provide one scientifically relevant source for that conclusion.

Cheers
Jens

--

'Well, 'Zooming with your feet' is usually a stupid thing as zoom rings are designed for hands.' (Me, 2006)
http://www.jensroesner.de/
--=! Condemning proprietary batteries since 1976 !=--
 
Hi Rune!

Not sure if this is a joke or not, so I actually feel a bit bad about laughing.

In case this is not a joke, do you have any idea why the light rays should come in differently for the two sensor shapes?

Cheers
Jens

--

'Well, 'Zooming with your feet' is usually a stupid thing as zoom rings are designed for hands.' (Me, 2006)
http://www.jensroesner.de/
--=! Condemning proprietary batteries since 1976 !=--
 
Hello Mr. Troll. You may have used a mirror lens, and you should therefore be familiar with the concept of curved surfaces and field of view. The first step is to design a CCD that only detects light that comes in from directly above (for instance 89 to 91 degrees). This can already be achieved with the right kind of polarization filters or microlenses. Bending the CCD will now change the field of view, since each single CCD-cell will point in a slightly different direction, just like bending the mirror in a mirror lens would change the field of view. It is also worth mentioning that the retina in the back of your eye is not a flat surface, but a curved one. This is why there is very little fisheye effect when watching things in real life. A flat retina would require much more advanced optics to keep straight lines straight.
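To put the geometry in concrete terms, here is a tiny Python sketch (the 24 mm sensor width and the bend radii are made-up numbers, not from any real design): if every cell only accepts light along its own surface normal, the field of view is simply the angle subtended by the bent sensor arc.

```python
import math

def field_of_view_deg(sensor_width_m: float, bend_radius_m: float) -> float:
    """Angle subtended by a sensor of given width bent into a circular
    arc of the given radius. With purely normal-accepting cells, that
    arc angle *is* the field of view: a flat sensor (infinite radius)
    gives 0 degrees, i.e. extreme telephoto; tighter bends give
    wider angles of view."""
    return math.degrees(sensor_width_m / bend_radius_m)

# Hypothetical 24 mm wide sensor at a few bend radii:
for r in (math.inf, 0.10, 0.024, 0.012):
    print(f"radius {r:>6} m -> FOV {field_of_view_deg(0.024, r):6.1f} deg")
```

Bending the 24 mm strip to a 12 mm radius (a semicircle) would give well over 100 degrees, while flattening it out drives the field of view toward zero, which is the "zoom" in a nutshell.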
 
Hi Rune!
Hello Mr. Troll.
It is kind of funny if someone actively participating in this forum for over 15 months (me) is being called a troll by a visitor around here (you). I'm not calling you names, and I'd prefer if you would answer my sarcasm and irony with sarcasm and cynicism instead of comparing me with big hairy creatures :)

In any case, if you set out this morning to make my day enjoyable, you succeeded.
filters or microlenses. Bending the CCD will now change the field
of view, since each single CCD-cell will point in a slightly
different direction,
That's a neat idea. Any idea where the light should come from that matches the orientation of the sensor?

Talking about curved sensors, it might be helpful for you to have a look at the Horizon 202 and similar swing-lens cameras. And before you think that this camera is proving your point, think some more. :)
It is also worth mentioning that
the retina in the back of your eye is not a flat surface, but a
curved one. This is why there is very little fisheye effect when
watching things in real life. A flat retina would require much more
advanced optics to keep straight lines straight.
You seem to be mixing up curvature of field, focal length change, focus change and lens design in a rather unpredictable way.

When looking at the eye, you should always remember that one of the world's most powerful post-processors post-processes its data-stream in real-time to give us "vision". And I certainly appreciate that you have a vision for the future.

Cheers
Jens

--

'Well, 'Zooming with your feet' is usually a stupid thing as zoom rings are designed for hands.' (Me, 2006)
http://www.jensroesner.de/
--=! Condemning proprietary batteries since 1976 !=--
 
Sorry for calling you a troll. That was totally out of line.

Curved CCDs have actually been in production for millions of years, and I therefore assume that the idea is a good one. Maybe we currently don't know how to make something like this ourselves yet, but sooner or later some clever engineer will. I have included a picture of one of the latest models below, and I guess that also answers your question about where the light will come from.

 
Sorry for calling you a troll. That was totally out of line.
Accepted.
Curved CCDs have actually been in production for millions of years,
I know. And it is a neat idea. But it will not work as you suggested. The reason is that such a curved sensor makes no sense if it sits behind a "normal" lens. The lens projects onto a plane, not a sphere. If you designed a lens that did so, you'd still have the problem that the curve of the sensor meets the curve of the image in only one position. So, the sensor would have to be used on its own.

Problem is that each sensor element ("Pixel") would require its own focus mechanism. Or, if you accept a certain curvature of field or focus error, a set of movable micro lenses.

I also think that your 89° idea will not work. With a slight bend outward you'd have no complete field coverage, as the 2° cones of each sensor would not "touch" one another.
Well, I guess you can come up with some ideas for that.
and I therefore assume that the idea is a good one. Maybe we
currently don't know how to make something like this ourselves yet,
but sooner or later some clever engineer will.
It's no problem to do something like that now. It just isn't small: Camera arrays can be built already. In that case, each sensor element would be a camera, with a lens.
I have included a
picture of one of the latest models below, and guess that also
answers your question about where the light will come from.
Not really, unless that bug was wearing glasses in which case I'd be surprised :D Seriously, I guess it is clear now what I meant.

Jens

--

'Well, 'Zooming with your feet' is usually a stupid thing as zoom rings are designed for hands.' (Me, 2006)
http://www.jensroesner.de/
--=! Condemning proprietary batteries since 1976 !=--
 
But it will not work as you
suggested. The reason is that such a curved sensor makes no sense
if it sits behind a "normal" lens. The lens projects onto a
plane, not a sphere. If you designed a lens that did so, you'd
still have the problem that the curve of the sensor meets the
curve of the image in only one position. So, the sensor would have
to be used on its own.
True, the whole point is to skip the lens entirely. It could still be covered by a thin, clear hemisphere of plastic or something, though. Maybe I will make a render some time in the future to show what that would look like. Although skipping lenses would be great, you would still be able to use it behind a "normal" lens if you don't use the wide-angle functionality.
Problem is that each sensor element ("Pixel") would require its own
focus mechanism. Or, if you accept a certain curvature of field or
focus error, a set of movable micro lenses.
Not really. A small fixed-focus, fixed-aperture (F10 or above) microlens would do fine.
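As a rough plausibility check (all numbers invented for illustration, not from any real microlens), the classic hyperfocal-distance formula shows why a tiny F10 lens needs no focus mechanism: with a very short focal length, essentially everything beyond a fraction of a millimetre is acceptably sharp.

```python
def hyperfocal_m(focal_length_m: float, f_number: float, coc_m: float) -> float:
    """Classic hyperfocal distance H = f^2 / (N * c) + f. A lens
    focused at H renders everything from H/2 to infinity acceptably
    sharp, for circle of confusion c."""
    return focal_length_m**2 / (f_number * coc_m) + focal_length_m

# Invented microlens numbers: 50 um focal length, F10,
# 1 um circle of confusion (roughly one small pixel).
H = hyperfocal_m(50e-6, 10, 1e-6)
print(f"hyperfocal distance: {H * 1000:.2f} mm")
```

With those made-up numbers the hyperfocal distance comes out at a fraction of a millimetre, so a fixed-focus microlens really would cover everything a camera normally points at.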
I also think that your 89° idea will not work. With a slight bend
outward you'd have no complete field coverage, as the 2° cones of
each sensor would not "touch" one another.
Well, I guess you can come up with some ideas for that.
When you use it on extreme tele, all the cones will touch each other (although that may be 100000 km in a straight line away from where you are standing). 90.00 degrees would be ideal, and will always give a clear picture. Current CCDs actually have space between the CCD-cells too.
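Here is a toy small-angle model of when two neighbouring acceptance cones start to overlap (the 5 um pitch is an invented number; the 89-91 degree cone is the one from earlier in the thread). It shows that whether the cones ever touch depends on how fast they diverge compared to how wide they are.

```python
import math

def full_coverage_distance_m(pitch_m: float,
                             half_angle_deg: float,
                             divergence_deg: float) -> float:
    """Distance beyond which the acceptance cones of two adjacent
    cells overlap, in a small-angle model: the cells sit one pitch
    apart, their pointing directions diverge by `divergence_deg`,
    and each accepts a cone of +/- `half_angle_deg`. Returns
    math.inf if the cones diverge faster than they widen, so that
    they never touch."""
    a = math.radians(half_angle_deg)
    d = math.radians(divergence_deg)
    if d >= 2 * a:
        return math.inf  # gaps between cones grow with distance forever
    return pitch_m / (2 * a - d)

# Hypothetical 5 um pitch, 89-91 degree (i.e. +/- 1 degree) acceptance:
print(full_coverage_distance_m(5e-6, 1.0, 0.0))   # flat sensor, extreme tele
print(full_coverage_distance_m(5e-6, 1.0, 1.5))   # slight outward bend
print(full_coverage_distance_m(5e-6, 1.0, 2.5))   # strong bend: never touch
```

So with these toy numbers a flat or gently bent sensor gets full coverage within a short distance, but once the per-cell divergence exceeds the cone width, the gaps never close, which is the crux both sides of this thread are arguing about.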
Not really, unless that bug was wearing glasses in which case I'd
be surprised :D Seriously, I guess it is clear now what I meant.
It doesn't need to wear lenses.
 
