It's an evolution, not a revolution. It shows a maturing of the sector, much like smartphones did.
Additional features very much depend on competition. Is there a superior camera? That's why the A6000 hung around so long: they nailed it from the get-go. The next model comes with a touchscreen and other bits and pieces, the model after that with IBIS perhaps, and so on, unless another manufacturer steps up to the plate and releases something revolutionary. Canon and Nikon seem stuck in a rut; Fuji and Olympus are doing some nice stuff, as is Sony.
Here's a copy/paste of a post I made in the SAR comments section:
Ah, found the original Chipworks report. It talks about the D800 (and also the original A7R) sensor being made on a 180nm process -
https://www.chipworks.com/abou...
This is the important one - that's a cross-section of the D800/A7R image sensor.
In that image, the photosites are 320 image pixels wide, measuring from wiring center to wiring center. So 4750 nm per site / 320 image pixels ≈ 14.84 nm per image pixel.
Each wiring area appears to be 120 image pixels wide (I measure 116 for one and 126 for the other, so 120 as an average is close enough), and half of each wiring area eats into the photosite on either side. So of the 320 pixels (4750nm) of site width, 120 pixels (about 1781nm) are eaten by wiring. That's almost 40% on its own.
What we don't know is if the wiring is only consuming area in one dimension or two. So best case the A7R is losing 40% of its area to wiring, worst case (2 dimensions) it's losing 60% of its area to wiring.
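For anyone who wants to sanity-check the arithmetic, here's a quick Python sketch of the calculation above. The pixel counts are my eyeball measurements off the Chipworks cross-section image, not official specs, so treat the outputs as rough:

```python
# Back-of-envelope check of the wiring-loss estimate for the A7R/D800 sensor.
# Pixel counts are measurements off the Chipworks cross-section image.

site_pitch_nm = 4750       # A7R/D800 photosite pitch
site_width_px = 320        # site width measured in image pixels
wiring_width_px = 120      # average of the ~116 px and ~126 px measurements

nm_per_px = site_pitch_nm / site_width_px          # ~14.84 nm per image pixel
wiring_width_nm = wiring_width_px * nm_per_px      # ~1781 nm eaten by wiring

loss_1d = wiring_width_nm / site_pitch_nm          # ~0.375 -> "almost 40%"
usable_2d = (1 - loss_1d) ** 2                     # if wiring eats both axes
loss_2d = 1 - usable_2d                            # ~0.61 -> "about 60%"

print(f"{nm_per_px:.2f} nm per image pixel, {wiring_width_nm:.0f} nm wiring")
print(f"1D loss: {loss_1d:.0%}, 2D loss: {loss_2d:.0%}")
```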
Let's assume the A6000 is on the same manufacturing process, so we lose the same wiring width per site (1781nm). But our sites are only about 3900 nm apart. (I calculated 3900, but everyone else says 3800... I'll carry both numbers through, since it barely changes the result.)
So if the A6000 is on the same 180nm process, we are losing 1781nm to wiring out of roughly 3800-3900nm per site. That's about 46-47% lost for the best case (1D), and almost 70% if the loss is in two dimensions. (This seems like TOO much, suggesting that without a process update, 24MP APS-C was probably already pushing things too far.) In terms of usable area, that's somewhere between roughly 30% and 54%.
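Same sketch carried over to the A6000 pitch, with both the 3800nm and 3900nm values plugged in (again, my rough numbers, not anything official):

```python
# Apply the ~1781 nm wiring width (from the A7R estimate) to the A6000 pitch.

wiring_width_nm = 1781

for pitch_nm in (3800, 3900):                      # both quoted pitch values
    loss_1d = wiring_width_nm / pitch_nm           # ~46-47% lost
    usable_1d = 1 - loss_1d                        # ~53-54% usable (1D case)
    usable_2d = usable_1d ** 2                     # ~28-30% usable (2D case)
    print(f"pitch {pitch_nm} nm: 1D loss {loss_1d:.0%}, "
          f"usable area {usable_2d:.0%} to {usable_1d:.0%}")
```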
So if Sony moves to a 45nm process (assuming that Renesas fab they bought is in play), and the wiring shrinks in proportion to the process dimensions, the loss per site goes down to 1781 × (45/180) ≈ 445nm. For our 1D best case that means about 11% of the area lost; for the 2D case it's about 22%. That translates to between 78% and 89% area usage efficiency.
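And the same sketch again at 45nm, under the assumption that the wiring width shrinks linearly with the process node:

```python
# Estimate the wiring loss at 45nm, assuming linear scaling with process node.

wiring_180nm = 1781
wiring_45nm = wiring_180nm * 45 / 180              # ~445 nm per site

for pitch_nm in (3800, 3900):
    loss_1d = wiring_45nm / pitch_nm               # ~11-12% lost
    usable_1d = 1 - loss_1d                        # ~88-89% usable (1D case)
    usable_2d = usable_1d ** 2                     # ~78% usable (2D case)
    print(f"pitch {pitch_nm} nm: usable area {usable_2d:.0%} to {usable_1d:.0%}")
```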
So going from 30% to 78% usable area is a bit more than a stop of gain (78/30 ≈ 2.6×, roughly 1.4 stops); going from 54% to 89% is a bit less (89/54 ≈ 1.65×, roughly 0.7 stops).
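Expressed in stops (log base 2 of the ratio of usable areas), using the rounded figures above:

```python
from math import log2

# Convert the change in usable area into stops of extra light gathered.
print(f"2D case (30% -> 78%): {log2(78 / 30):.2f} stops")   # ~1.4 stops
print(f"1D case (54% -> 89%): {log2(89 / 54):.2f} stops")   # ~0.7 stops
```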
Also, the stack above the photodiode is probably losing some light; with the process shrink, that region should be thinner, likely on the order of 1/4 as thick as before.
So a full stop increase in signal to the sensor is feasible if the former Renesas fab that was making Nintendo Wii U chips is being used to make the A6300 sensor.
This also means that the best further improvement BSI could provide beyond this would be eliminating the remaining 10-20% of wasted area.
Of course, this all assumes that they're using the former Renesas 45nm facility at its full 45nm capability; they could be somewhere in between. But it's clear from Chipworks' dissection of the A7R/D800 sensor that a process shrink below 180nm could yield significant gains. And because the imaging industry sits so far behind the CPU/GPU industry, if Sony can indeed repurpose a facility that was making CPUs/GPUs into making sensors within two years, they can take a massive technology leap and still be years behind the CPU/GPU industry. (Image sensors are much larger than CPUs/GPUs and as a result face more yield challenges, so it makes sense that the imaging industry stays well behind the CPU/GPU industry in process technology.)
One interesting aspect, again assuming that it's that Renesas facility: they purchased a facility using an SOI process. It might not be possible for that facility to manufacture BSI sensors. Who knows how SOI substrates would interact with BSI... I'm assuming they wouldn't play nice together.
------------------------------------------------
Note that my comments regarding that Renesas facility are based on some other things I posted, backed by:
http://www.kitguru.net/components/g...may-leave-nintendo-wii-u-without-edram-cache/ - Indicates that a facility Sony bought for sensor production was previously making the Latte GPU for the Wii U. The Wii U's CPU is made at IBM Fishkill on 45nm copper SOI; the GPU is apparently made locally on an identical or at least very similar process, 45nm copper SOI.
http://image-sensors-world.blogspot.com/2014/01/sony-buys-12-inch-renesas-fab-to-expand.html (mentions 40-45nm process)
A transition from a 180nm to a 45nm manufacturing process would indeed be revolutionary for high-density sensors. This seems like an extreme leap, but keep in mind that Canon was in the process of moving from 500nm (all FF sensors up to and including the 5D III) to 180nm (some of their P&S sensors in 2012). I can't easily find anything more recent than 2012 for Canon, but this also suggests the imaging industry doesn't change processes often (Canon was on 500nm for FF for a decade), likely only transitioning when it becomes highly beneficial. 24MP was right on the edge where BSI becomes beneficial for APS-C, but BSI is expensive, and it may have been cheaper to give that "new" 45nm fab a shakedown with its first APS-C sensor. As outlined above, 45nm would give most of the improvements of BSI without actually using BSI, for a sensor of the A6300's density.