So long X-Trans IV, you've been good to us!

they dropped a camera that had a modern control system, like the X-S10, and Bayer instead of X-Trans; and especially if they came out with an updated 56/1.2 that focuses fast, quietly, and smoothly, and an updated XF35/1.4 with those same focus qualities that was also sharp across the frame, not just in the center. (I had the X-E1 and several primes when the system started, got frustrated, and moved on to something else, but the things I mentioned above, along with all the vast improvements in the bodies since then, would likely get me back; currently I'm all-in with m4/3.)
 
New sensors have come along since my first X-E1. Image quality sometimes looks better with more modern sensors, AF is faster, and so on... better cameras are a bit better. Not great news.

8K is more like a curse: who really uses it? You need a new computer, a lot of storage space (cards and hard drives), and an 8K TV... I have not seen any serious amateur movies shot with 8K cameras. Are there any?

Will the new sensor give us more dynamic range? Better IQ?

I'm not thrilled - more like boring news as usual
If you want better IQ, get better lenses. The camera doesn't really matter much for IQ (aside from noise and dynamic range, of course).
That is how I see it. Sensor development is OK, but it is "normal".

I already have some nice lenses. The X-H1 works, and it could be great for video, but I do not have time for making 4K video stories; 8K is even more time-consuming.

The X-E3 fits my hands in a good way (ergonomics), and my pockets.

If the H2 really has something groundbreaking, I can consider it, of course. Groundbreaking happens only now and then. I jumped from the X-T1 to the H1 (when it was a bargain) and from the X-E1 to the E3... I did not buy every new model.
 
Only a stacked BSI sensor allows increasing the megapixel count of the sensor.
What does stacked BSI have to do with MPs?
 
One thing to keep in mind is that there is a point of diminishing returns, and even a point of trade-offs, with more MP on the same size sensor.

The 26 MP APS-C sensor is diffraction limited at f/8. The Airy disk diameter at f/8 is 10.65 micrometers and the pixel pitch is 3.84 micrometers. A 30 MP sensor would have a pixel pitch of 3.58 micrometers and would become diffraction limited at f/6.7. A 40 MP sensor would have a pixel pitch of 3.1 micrometers and become diffraction limited at f/6.3. There is no free lunch: whatever increase in resolution you gain, you might give more of it away to diffraction when large DOF is required.

A new sensor with more MP would not entice me to upgrade, because more MP can be a two-edged sword. I don't care about video, so 8K video is not going to float my boat. However, a new sensor with a base ISO of 64 to 100 and an extra stop of dynamic range at base, at 24 to 26 MP APS-C: then I would definitely be interested. X-Trans vs. Bayer: don't really care. Today, with Capture One, Iridient Developer, and others, there is sufficient software support to handle X-Trans effectively. Lightroom may or may not be on that list, but I don't use Lightroom, so I don't care. For Fuji X, I don't see Fuji backing away from X-Trans; they have too much marketing hype invested to do that.
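The diffraction figures quoted above can be sanity-checked with a short script. This is a sketch under stated assumptions: an APS-C sensor of 23.6 × 15.6 mm (slightly different dimension assumptions give slightly different pitches) and green light at 546 nm, which reproduces the post's 10.65 µm Airy diameter at f/8. Where exactly diffraction starts to "limit" a given pixel pitch is a convention, not a hard threshold, as the replies below this post point out.

```python
# Sanity-check of the Airy disk / pixel pitch arithmetic.
# Assumptions: 23.6 x 15.6 mm APS-C sensor, 3:2 aspect, 546 nm green light.
# Airy disk diameter (to the first minimum) = 2.44 * wavelength * f-number.
import math

SENSOR_W_MM = 23.6
SENSOR_H_MM = 15.6
WAVELENGTH_UM = 0.546  # green light, micrometers

def pixel_pitch_um(megapixels: float) -> float:
    """Approximate pixel pitch in micrometers for an APS-C sensor."""
    pixels_wide = math.sqrt(megapixels * 1e6 * SENSOR_W_MM / SENSOR_H_MM)
    return SENSOR_W_MM * 1000.0 / pixels_wide

def airy_diameter_um(f_number: float) -> float:
    """Airy disk diameter (first minimum) in micrometers."""
    return 2.44 * WAVELENGTH_UM * f_number

for mp in (26, 30, 40):
    print(f"{mp} MP APS-C: pixel pitch ~{pixel_pitch_um(mp):.2f} um")
for f in (5.6, 8, 11):
    print(f"f/{f}: Airy disk ~{airy_diameter_um(f):.2f} um")
```

Comparing the two columns shows the trade-off: by 40 MP the f/8 Airy disk spans more than three pixel widths, so stopping down for DOF erodes the extra resolution.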
Truman, why do you always say what I’m thinking! 😄 Agreed on all points.

16MP was enough on my old Pentax; I just wanted to go mirrorless, and the X-H1 was a steal at the price I paid.
 
Why only a "bump in megapixels"? There are so many other sensor improvements needed.
  1. Lower native base ISO: please follow the Nikon Z7 (starting from ISO 64).
  2. Quad native base ISO, e.g. ISO 50/400/1600/6400.
  3. A built-in, electronically variable (optical) graduated ND filter, driven by AI.
  4. Remove the gain amp: automatically force the native base ISO plus an ND filter (electronically variable, or Live ND) to reduce noise, e.g. ISO 1600 + ND = ISO 800.
  5. An electronically variable (optical) graduated ND filter would also be useful for automatically maintaining the same exposure (auto mode adjusting the ND strength) in long-exposure photography.
 
Hi, cool poll, but just a note: a stacked sensor is not mutually exclusive with X-Trans. These are orthogonal technologies.
 
An improvement in noise would be my only request. I suspect my Mitakon wouldn't benefit much from the bump in megapixels.
 
I'll just throw "global shutter" on this wish list

Though I sincerely doubt Sony would give that to them first.
 
I'd stick with X-Trans, back to 24MP, but with a stacked BSI sensor with dual gain.

According to rumors, the X-E4 is the last X-mount camera to feature the 26MP X-Trans IV sensor.

It's crazy to think that we're already getting a new sensor. It seems like the (in my opinion excellent) X-T3 was launched just five minutes ago.

Either way, we're getting a new sensor with the X-H2, which will come out (probably early) next year. So what's it going to be? A stacked sensor? A Bayer CFA? Or maybe we'll get two new sensors, to further distinguish the different models. Anyway, exciting times are ahead.

For those who wish to participate, here is a small poll:
 
Although it has been in common use, the term "diffraction limited aperture" (DLA) is a bit misleading. Diffraction does not have a hard limit where crossing an aperture threshold will cause your images to fall apart. Diffraction occurs at all apertures with all lenses. The only thing that changes with sensor resolution is the ability of the camera to record that diffraction. Higher resolution/finer pixel pitch sensors do not capture worse images, they just capture things that the lower resolution sensor could not discern.
 
Sadly, the fine gradations of BSI, stacked, X-Trans vs. Bayer et al, are lost on me.

Whatever the sensor in my X-T3 happens to be does a very fine job when I care, or am able, to take a half-way decent picture with it.

As did my X-T1 and X-T20, when I still had them, and as my Nikon D500 and Z5 also still do.

They all print big enough when needed and clients hand over cash when they're happy with the image. I've never, ever had a client complain about the sensor I was using. In fact, I've never, ever had a client even ask me if I was using a Fuji or a Nikon.

At a personal level - and one where I really don't shoot for money - it's about wildlife, birds, and BIF. (Let me say it before anyone else does: I like my wildlife images, but they're really not good enough for anyone to pay anything for!)

Which brings me back to Fuji. Unless I see some serious intent in terms of long, high-quality primes (like my Nikon 300PF and 500PF), along with improvements in autofocus speed, I couldn't care less what sensor appears in the next camera; I have spent my last money on Fuji.
 
Agreed 100%. You will still get more detail from a higher-MP sensor than from a lower one, even beyond the higher-MP sensor's diffraction threshold. We have to evaluate cameras through photos, not laboratory data charts and formulas.

--
Sometimes I take pictures with my gear- https://www.flickr.com/photos/41601371@N00/
 
If you print, it is a different story. It all comes down to the CoC. For 100% viewing on a display, the CoC is much smaller, maybe 20% of the size used for a print; there are a lot of reasons for that. In reality, diffraction above the diffraction limit should be considered equivalent to an anti-aliasing filter, with one minor difference: AA filters are designed based on the resolution, so the higher the sensor resolution, the higher the cutoff spatial frequency of the AA filter, whereas diffraction is only a function of the aperture. Putting more pixels inside the Airy disk will not improve the resolution; the same point source will just be spread over more pixels. And two point sources will have overlapping disks independently of the number of pixels, once you reach the CoC.
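The point that sampling more finely inside the Airy disk does not separate merged point sources can be illustrated with a toy one-dimensional model. This is a sketch under made-up assumptions: a Gaussian stands in for the Airy pattern at roughly f/8, the pitches 3.8 and 3.1 µm roughly correspond to 26 and 40 MP APS-C, and the 10% "resolved" contrast threshold is arbitrary.

```python
# Toy 1-D model: two point sources blurred by a diffraction-like kernel,
# then sampled at two pixel pitches. Finer sampling does not recover
# sources that the blur has already merged.
import math

SIGMA_UM = 2.7  # Gaussian stand-in for the f/8 Airy core (~10.7 um disk)

def blurred_pair(x_um, sep_um):
    """Intensity at x from two equal point sources sep_um apart,
    each blurred by a Gaussian approximating the Airy pattern."""
    def g(x0):
        return math.exp(-((x_um - x0) ** 2) / (2 * SIGMA_UM ** 2))
    return g(-sep_um / 2.0) + g(sep_um / 2.0)

def resolved(sep_um, pitch_um):
    """Sample the blurred pair at the given pixel pitch and check for
    a dip between the two peaks (arbitrary 10% contrast threshold)."""
    samples = [blurred_pair(i * pitch_um, sep_um) for i in range(-20, 21)]
    center = blurred_pair(0.0, sep_um)  # midpoint between the sources
    peak = max(samples)
    return (peak - center) / peak > 0.10

for sep in (6.0, 12.0):
    for pitch in (3.8, 3.1):  # ~26 MP vs ~40 MP pixel pitch
        print(f"sep {sep} um, pitch {pitch} um: resolved = {resolved(sep, pitch)}")
```

In this toy model, sources 6 µm apart stay merged at both pitches, while sources 12 µm apart are resolved at both: the blur, not the sampling density, sets the limit.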
 
As far as megapixels go, I feel that what I have is enough. I'll always take more exposure latitude, or dynamic range, or whatever it's being called these days.
I would not mind a higher resolution, but my priority is on dynamic range and high-ISO performance!
 
I have forgotten my school/university mathematics, but I still understand what a micrometer is.

I just wonder if I understand this discussion right... A Bayer or X-Trans sensor is made of red, blue, and green "pixels" in a certain array, unlike a Foveon. One "pixel" in a RAF file is derived from several photosites on the sensor, if I understand the RGB idea right. Processing has a role too, and a RAW file is just a digital file; there are no physical pixels in it. I doubt the limits caused by diffraction can be calculated very precisely from the size of just one pixel. Of course I understand that past f/8 or f/11 the resolution suffers; it can be seen in tests, and I think I have seen it in my own images at 200% view. But saying that some sensor becomes "diffraction limited" sooner than some other sensor is just useless information, or we can happily forget it. I use different f-values to get the picture I want; I just try to avoid f/16-32 if possible. DLA does not limit me very much; only my skills do that.

Of course 40-50MP sensors are not really needed; people do not print huge prints, or even small prints, nowadays. On a PC or TV screen we cannot tell if an image was taken with a 10MP sensor or a 100MP one... to avoid diffraction-limit problems, a 5MP sensor could be great. ;-) ;-)
 
The diffraction limit is calculated from the size of the Airy disk relative to the CoC. The CoC is a concept that differs between a print and a digital display; it is based on the human visual response. However, with digital and the ability to map a pixel 1:1 from the sensor capture to the display, the CoC now goes down to the pixel level. At 100%, if a point source is spread over N pixels on the sensor, it is spread over N pixels on the display. If you are not displaying at 100%, there is smoothing in generating the display, which can act like any low-pass filter, so the system now has the equivalent of an AA filter, which will raise the CoC.

However, when the Airy disk spans multiple photosites, a point source is spread over multiple pixels. When two point sources are closer together than the diameter of an Airy disk, they will merge on the sensor, and that limits the resolution of the optical system.

Does that mean one should not use f/8 and above? Nope. But when I do, I use convolutional sharpening in processing to help mitigate diffraction. You can't eliminate it, but you can mitigate the effect. More importantly, I don't pay much attention to diffraction on a display, since my final output is in fact a print.
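"Convolutional sharpening" here usually means deconvolution: undoing a known blur kernel rather than just boosting edge contrast. Below is a toy one-dimensional Richardson-Lucy sketch, not whatever tool the poster actually uses; the 5-tap PSF is a made-up stand-in for a diffraction blur.

```python
# Toy 1-D Richardson-Lucy deconvolution: recover a point source
# smeared by a known, symmetric blur kernel.

def convolve(signal, kernel):
    """Direct 1-D convolution with zero padding at the edges."""
    n, half = len(signal), len(kernel) // 2
    out = []
    for i in range(n):
        acc = 0.0
        for j, kv in enumerate(kernel):
            k = i + j - half
            if 0 <= k < n:
                acc += signal[k] * kv
        out.append(acc)
    return out

def richardson_lucy(observed, psf, iterations=50):
    """Classic multiplicative Richardson-Lucy update."""
    psf_flipped = psf[::-1]
    estimate = [1.0] * len(observed)
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [o / b if b > 1e-12 else 0.0 for o, b in zip(observed, blurred)]
        correction = convolve(ratio, psf_flipped)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

# A point source blurred by a symmetric 5-tap PSF (diffraction stand-in):
psf = [1 / 9, 2 / 9, 3 / 9, 2 / 9, 1 / 9]
true_scene = [0.0] * 10 + [9.0] + [0.0] * 10
observed = convolve(true_scene, psf)      # the 9.0 spike is smeared to a peak of 3.0
restored = richardson_lucy(observed, psf)
print(f"observed peak {max(observed):.2f}, restored peak {restored[10]:.2f}")
```

As the poster says, this mitigates rather than eliminates diffraction: the restored peak climbs well above the blurred one, but only because the PSF is known and the noise here is zero.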
 
I want a 40MP X-Trans sensor, and I don't see any reason it can't be a Stacked Sensor.

X-Trans

40MP

Fast Readout

And then a 2:1 Magnification 100mm Macro Lens.

Please and thank you Fuji.
 
Not a technical problem; camera makers just want a better profit margin.

The latest tech is more expensive, which reduces the camera maker's profit margin.

The latest chips (smartphone processors) are built on a 5nm process. A stacked sensor built on 5nm would have an extremely fast readout speed, no rolling-shutter issue, and no longer need a mechanical shutter.

An APS-C sensor can also reach faster readout speeds more easily than FF.

Does anyone know which process node (5/7/10/14nm?) is used by the Sony A1's stacked sensor?

Currently only Taiwan's TSMC and Samsung are able to manufacture 5/7nm chips.

The Apple M1 processor is also built by TSMC on a 5nm process.
 
Yes

Nice to hear that there are still people who print images ...

I just bought a new printer; the old one got too old, and after almost 10 years of service it was time to replace it. The new Epson SC-P900 prints A2 images, and the quality has been stunning in the few test images I've made. I think I will totally forget diffraction limitations (I have never had problems with diffraction...) and try to make some decent prints. So should many of us, if possible. A photograph should be a print on paper; not always, but more often.
 
