Flat gold & silicon lens developed that is much sharper than glass

Started Aug 30, 2012 | Discussions thread
alanr0
Senior Member • Posts: 1,262
Re: Coherent sensors: waveform and wavefront sampling
In reply to hjulenissen, Aug 31, 2012

hjulenissen wrote:

alanr0 wrote:

For white light, optical phase and amplitude are insufficient to distinguish change in wavelength from change in direction, so you need additional information. Time-resolved measurements at each point across the wavefront at 0.6 femtosecond intervals might do the trick.

If all that a lens does (ideally) is to delay some parts of the light more than others, one would think that the same can (in principle) be done by recording the wavefront in space/time (using a special image sensor with no lens in front), and combining the elements with different amplitude/phase modifications, just like a delay-and-sum array?

'Delay and add' is not the same as 'phase shift and add' when you have an octave of electromagnetic spectrum to handle.

That example would seem to be a direct analogy to microphone/antenna arrays in that the bandwidth is significant (antenna arrays are sometimes considered "monochromatic" like lasers, but microphone arrays tends to be several octaves).

With sound you can record the acoustic field directly. If each microphone measures air pressure at a sample rate of 50 kHz, you collect enough information to reproduce the audible spectrum from 20 to 20000 Hz. Applying a position-dependent delay is straightforward. With enough measurement points, you can figure out where each part of the sound field originated, or synthesize an arbitrary acoustic lens.
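The delay-and-sum idea above can be sketched directly, since microphones sample the pressure field itself. This is a minimal numpy illustration with a hypothetical 8-element line array and a single test tone; the geometry, angle, and fractional-delay method (a frequency-domain shift) are my own choices, not from the post:

```python
import numpy as np

fs = 50_000              # sample rate, Hz (as in the post)
c = 343.0                # speed of sound, m/s
f0 = 1_000               # test tone, Hz
theta = np.radians(30)   # assumed arrival angle of a plane wave

# Hypothetical 8-microphone line array, 10 cm spacing
n_mics = 8
positions = np.arange(n_mics) * 0.10

t = np.arange(0, 0.02, 1 / fs)

# Each microphone sees the same tone, delayed by its position along the wavefront.
arrival_delays = positions * np.sin(theta) / c
signals = np.array([np.sin(2 * np.pi * f0 * (t - d)) for d in arrival_delays])

def steer(signals, delays, fs):
    """Delay-and-sum: undo each mic's delay via a frequency-domain shift, then average."""
    n = signals.shape[1]
    freqs = np.fft.rfftfreq(n, 1 / fs)
    out = np.zeros(n)
    for sig, d in zip(signals, delays):
        spec = np.fft.rfft(sig) * np.exp(2j * np.pi * freqs * d)  # advance by d
        out += np.fft.irfft(spec, n)
    return out / len(signals)

aligned = steer(signals, arrival_delays, fs)   # steered toward the true angle
misaligned = steer(signals, np.zeros(n_mics), fs)  # broadside: delays uncorrected

# Coherent gain when the steering delays match the arrival delays
print(np.std(aligned), np.std(misaligned))
```

Because a true time delay is applied, the same steering works at every frequency in the band at once, which is exactly what a phase plate cannot do.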

For visible light the EM field oscillates at 400 to 750 THz, so we can't sample the electric field directly. Instead we measure the phase offset from a reference (local oscillator) source. For monochromatic light, there is a linear relationship between delay and phase, and we can synthesize a lens by applying an appropriate position-dependent phase shift to the signal.
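For a single optical frequency, the equivalence between a delay and a phase rotation of the complex amplitude can be checked numerically. A minimal sketch, with an arbitrary amplitude and delay of my choosing:

```python
import numpy as np

f_opt = 600e12   # monochromatic optical frequency, Hz (~500 nm)
tau = 2.4e-15    # position-dependent delay a lens element would apply, s

# At one frequency, E(t - tau) = Re{ A e^{-i 2π f τ} e^{i 2π f t} },
# so a delay is exactly a phase rotation of the complex amplitude A.
phase_shift = -2 * np.pi * f_opt * tau
A = 1.0 + 0.5j                       # complex amplitude at one point
A_rotated = A * np.exp(1j * phase_shift)

# Compare an explicit time-domain delay with the phase-shifted amplitude
t = np.array([0.0, 1e-16, 3e-16])
direct = np.real(A * np.exp(2j * np.pi * f_opt * (t - tau)))
via_phase = np.real(A_rotated * np.exp(2j * np.pi * f_opt * t))
print(np.allclose(direct, via_phase))
```

This is why a position-dependent phase shift suffices to synthesize a lens for monochromatic light: the phase fully encodes the delay at that one frequency.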

For broadband light, the phase vs delay slope is different for each wavelength. A 1.5 femtosecond delay corresponds to 360 degree phase shift for 450 nm blue light, but only 216 degrees for 750 nm red light. A phase plate will behave differently at each wavelength.
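The two figures above follow from phase = 360° × f × τ, using the rounded speed of light so the numbers come out exactly as quoted:

```python
c = 3.0e8        # speed of light, m/s (rounded, to match the quoted figures)
delay = 1.5e-15  # fixed 1.5 femtosecond delay

for wavelength_nm in (450, 750):
    f = c / (wavelength_nm * 1e-9)   # optical frequency, Hz
    phase_deg = 360 * f * delay      # phase accumulated by the delay
    print(f"{wavelength_nm} nm: {phase_deg:.0f} degrees")
```

The same 1.5 fs delay is one full cycle at 450 nm but only 0.6 of a cycle at 750 nm, so a single fixed phase profile cannot mimic a true delay across the whole visible band.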

Is not the wavelength of light directly connected to the energy of each photon? So if that quantity could be measured (again: difficult to do in practice)

Heisenberg's uncertainty principle rears its head for single photon measurements. In the same way that measurements of position and momentum (or direction) are linked, a precise measurement of energy is not compatible with a precise measurement of phase.

For acoustic and radio signals, the individual phonon or photon energies are much lower, so even very weak signals correspond to many quanta, and the field behaves classically.
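The scale of the difference is easy to put numbers on with E = hf. A quick back-of-envelope comparison, with the picowatt signal level and integration time being my own illustrative choices:

```python
h = 6.626e-34            # Planck's constant, J·s
energy = 1e-12 * 1e-6    # energy in 1 microsecond of a 1 picowatt signal, J

for name, f in [("100 MHz radio", 1e8), ("600 THz visible", 6e14)]:
    quanta = energy / (h * f)   # number of photons carrying that energy
    print(f"{name}: {quanta:.3g} photons")
```

At radio frequencies the same weak signal contains tens of millions of photons, while at optical frequencies it contains only a few, which is why single-quantum effects like number-phase uncertainty matter for light but not for radio or sound.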

Fun to speculate. Cheers.
--
Alan Robinson
