Super rez for video, how long till it's reality?

Is it even possible? Thinking of how stills super rez is created: taking slightly differing samplings from the sensor and combining them. Natural hand motion will provide the sampling variation (like Olympus' HHHR). I'm thinking with, say, 24p, a single "frame" could be 3 shots combined, each at 1/72 sec.

The exposure would be the same, the SNR should be the same from noise averaging, but the resulting rez should be higher. The algorithm would have to be designed similarly to the newest versions of stills super rez, removing the ghosting of things moving quickly through the frame.
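For what it's worth, here is a rough Python sketch of the two claims - the photon flux, the clean 1/3-pixel offsets and the 1-D toy scene are all made-up assumptions, not any camera maker's actual pipeline. It shows that summing three 1/72 s sub-exposures gives essentially the same SNR as one 1/24 s exposure, and that interleaving sub-pixel-shifted samplings onto a finer grid recovers detail a single shot would alias. A real implementation would also have to estimate the shifts and deal with subject motion and ghosting, as noted above.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- SNR: three 1/72 s sub-exposures vs one 1/24 s exposure ---
flux = 7200.0  # hypothetical photons per pixel per second (made-up number)
one_frame = rng.poisson(flux / 24.0, size=100_000)               # single 1/24 s exposure
three_subs = rng.poisson(flux / 72.0, size=(3, 100_000)).sum(0)  # three 1/72 s exposures, summed

print("SNR, one 1/24 s exposure:  ", one_frame.mean() / one_frame.std())
print("SNR, three 1/72 s summed:  ", three_subs.mean() / three_subs.std())

# --- Resolution: interleaving sub-pixel-shifted samplings ---
# Toy 1-D "scene" with detail beyond what a single 100-sample shot can resolve.
fine = np.linspace(0.0, 1.0, 300, endpoint=False)
scene = np.sin(2 * np.pi * 70 * fine)   # 70 cycles: aliased at 100 samples, resolvable at 300

# Each "shot" samples the scene on the same coarse grid, offset by 1/3 of a pixel
# (standing in for the shifts that hand shake would provide).
shots = [scene[k::3] for k in range(3)]

# Naive shift-and-add: drop each shot into its slot on a 3x finer grid.
merged = np.empty_like(scene)
for k, s in enumerate(shots):
    merged[k::3] = s

print("max reconstruction error:", np.abs(merged - scene).max())  # ~0 in this idealised case
```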
Capturing motion is the whole point of video, so no. Also, 24p is a miserable temporal resolution to begin with - just a bit better than that of silent movies (yes, I know it is the "industry standard", and it sucks).
Yeah, except no. Higher frame rate is not "better".
Well, the world has infinite frame rate to begin with...
No, it doesn't. The human brain can discern information at up to 60 fps¹ but really sees at about a max of 40 fps²
The human brain is not the world we observe.
Yes, yes it is. You can only observe the world with your brain.
This is a different statement.

Here: This is you. That is the world. This observes that. Clear now?
No. You think there is another reality besides how you perceive it, and this is not correct with regard to visual media.
Well, I subscribe to philosophical realism and do believe that reality is out there, even if and when I cannot perceive it...

You are off target again. You fail to understand trivial notions.
That you cannot grasp this makes further engagement on this pointless.
:-)
 
It's strange to see somebody make so many arbitrary rulings and pretend they are some universal truth. You say our eyes can "discern" motion at 40fps; well, that remains to be proven medically or scientifically, but let's say it is true.
It does not "remain to be proven". It has been tested.
Funny, HERE is a research paper from 2014 that shows our brains can process images at about 75fps. You see, this is the problem when you start throwing around the word "tested": tested by whom?

On top of that, not everybody has identical DNA; some people's reflexes are far apart, and vision varies too. You have not proven anything you have said, just made claims.
So why isn't shooting at 40fps better than 24?
The reason that 24fps was settled on is more than cost. We think we see everything sharp and clear. We don't. Our brains blur motion and throw away information in what we do capture. Throwing more information at the brain, i.e. a higher frame rate, is not a boon.
Actually it's exactly the opposite: we see blurry things, but our brains make them sharp in our mind. It's why video "looks" sharp, but if you pause any particular frame, it is much softer than when the video is playing.
 
BTW, I just created a MATLAB script for a somewhat different purpose - to test my perception of photon noise depending on the fps. I can clearly see the difference between 30 fps and 60 fps. Not to mention that there is a huge difference in real-life shooting between the two, but then one can always blame the shutter speed for that.

I cannot test 40fps reliably because my monitor does 60 fps.
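The MATLAB script itself isn't posted; a rough Python equivalent of the idea might look like the sketch below. All parameters are made up, and whether the display loop actually sustains 60 fps depends on the plotting backend and the monitor - which is also why 40 fps can't be tested reliably on a 60 Hz screen.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

# Hypothetical parameters - the original MATLAB script is not shown.
FPS = 60              # set to 30 to compare
MEAN_PHOTONS = 50     # mean photons per pixel per frame (flat grey scene)

rng = np.random.default_rng()
fig, ax = plt.subplots()
im = ax.imshow(rng.poisson(MEAN_PHOTONS, (256, 256)), cmap="gray",
               vmin=0, vmax=2 * MEAN_PHOTONS)
ax.set_title(f"Photon noise, nominally {FPS} fps")
ax.axis("off")

def update(_):
    # Each displayed frame is an independent Poisson realisation of the same scene.
    im.set_data(rng.poisson(MEAN_PHOTONS, (256, 256)))
    return (im,)

anim = FuncAnimation(fig, update, interval=1000 / FPS, blit=True,
                     cache_frame_data=False)
plt.show()
```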
 
BTW, I just created a MATLAB script for a somewhat different purpose - to test my perception of photon noise depending on the fps. I can clearly see the difference between 30 fps and 60 fps.
That conflicts with something you said earlier:
I have a device which shoots at 60 fps and 30 fps. No difference in the noise. The reason is that it displays the 60 fps footage at, well, 60 fps, and the noise averages, but I said that already.
What do you think accounts for the two different observations?
Not to mention that there is a huge difference in real life shooting between the two but then one can always blame the shutter speed for that.
So, have you revised the theory regarding noise averaging and lighting with high frame rates?
 
Is it even possible? Thinking of how stills super rez is created: taking slightly differing samplings from the sensor and combining them. Natural hand motion will provide the sampling variation (like Olympus' HHHR). I'm thinking with, say, 24p, a single "frame" could be 3 shots combined, each at 1/72 sec.

The exposure would be the same, the SNR should be the same from noise averaging, but the resulting rez should be higher. The algorithm would have to be designed similarly to the newest versions of stills super rez, removing the ghosting of things moving quickly through the frame.

Could this be done? Any thoughts? Is this already an idea somewhere?
LOL, it would require a supercomputer, and I'm not even sure that would do the job, so don't hold your breath!!!!!
 
BTW, I just created a MATLAB script for a somewhat different purpose - to test my perception of photon noise depending on the fps. I can clearly see the difference between 30 fps and 60 fps.
That conflicts with something you said earlier:
I have a device which shoots at 60 fps and 30 fps. No difference in the noise. The reason is that it displays the 60 fps footage at, well, 60 fps, and the noise averages, but I said that already.
What do you think accounts for the two different observations?
I see different dynamics - the noise changes faster at 60 fps - but it does not look more or less noisy. When you pause, it does, as it should.

Here are the clips. Do NOT play them in your browser! The compression will destroy the noise. Download them and play them from your computer. It is tricky to play two videos side by side, in Windows at least; you can use two different players or some more sophisticated one (VLC should work). Also, the 30 fps clip is played back by doubling each frame, I guess.
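As a side note on what "the noise averages" would mean numerically - a minimal sketch with made-up numbers, not derived from the poster's clips: if the eye and brain effectively integrate a couple of successive frames during playback, the perceived photon noise drops by roughly the square root of the number of frames averaged, which is one plausible reason a paused frame looks noisier than the running video.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two consecutive frames of a flat grey scene with photon (Poisson) noise only.
f1 = rng.poisson(100, (512, 512)).astype(float)
f2 = rng.poisson(100, (512, 512)).astype(float)

print("noise in a single frame (std):     ", f1.std())                # ~10
print("noise in a two-frame average (std):", ((f1 + f2) / 2).std())   # ~10 / sqrt(2), about 7.1
```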
Not to mention that there is a huge difference in real-life shooting between the two, but then one can always blame the shutter speed for that.
So, have you revised the theory regarding noise averaging and lighting with high frame rates?
I wish I had; that would have shaken statistics as we know it and would have made me a celebrity in some circles.
 
... lighting become a serious problem with high frame rates.
Not really. This is a version of the large pixels myth but in the time variable.
Not sure I even understand what you are trying to say here.
Sampling the scene more frequently does not increase noise; it increases temporal resolution. Similarly, smaller pixels do not increase noise just because they are smaller.
The lighting issue with high frame rates is not about pixel size. It's a matter of exposure time.

Video shot at 60fps can have each frame exposed to light for no more than 1/60s.

Video shot at 1000fps can have each frame exposed to light for no more than 1/1000s.

That's what makes lighting a problem as frame rates keep increasing.
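To put rough numbers on that point - the 1/48 s baseline below is just the usual 180-degree shutter at 24p, an assumption for illustration, and real productions also juggle aperture and ISO:

```python
import math

# Light lost, in stops, relative to a 24p frame exposed at 1/48 s (180-degree shutter),
# assuming each frame is exposed for as long as the frame rate allows.
reference = 1 / 48
for fps in (24, 60, 120, 1000):
    max_exposure = 1 / fps   # a frame cannot be exposed longer than its own duration
    stops_lost = max(0.0, math.log2(reference / max_exposure))
    print(f"{fps:>4} fps: max exposure 1/{fps} s, light lost vs 1/48 s: {stops_lost:.1f} stops")
```

At 1000 fps the exposure ceiling alone costs roughly 4.4 stops of light versus a standard 24p exposure, which is why lighting becomes the constraint.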
The leaps and bounds in high-ISO capability in recent years more than make up for the increase in shutter speed.

No one is talking about 1000fps video right now, but if everyone starts shooting at 1/120s it will only benefit the movie industry as a whole.

3D already incurred a huge lighting penalty anyway, so most Hollywood studios should already have everything they need.
 
It's strange to see somebody make so many arbitrary rulings and pretend they are some universal truth. You say our eyes can "discern" motion at 40fps; well, that remains to be proven medically or scientifically, but let's say it is true.
It does not "remain to be proven". It has been tested.
Funny, HERE is a research paper from 2014 that shows our brains can process images at about 75fps. You see, this is the problem when you start throwing around the word "tested": tested by whom?
If you were paying attention, here I linked to an article that references that very study.

There is a difference between discerning information and discerning motion.
On top of that, not everybody has identical DNA; some people's reflexes are far apart, and vision varies too. You have not proven anything you have said, just made claims.
There are variations in DNA, this is true. But they are within a relatively tiny range, and there is a limit to improvement.
So why isn't shooting at 40fps better than 24?
The reason that 24fps was settled on is more than cost. We think we see everything sharp and clear. We don't. Our brains blur motion and throw away information in what we do capture. Throwing more information at the brain, i.e. a higher frame rate, is not a boon.
Actually it's exactly the opposite: we see blurry things, but our brains make them sharp in our mind.
What? OK, I didn't say it perfectly. Our brains don't actually blur the motion, but they are not capable of registering close-up motion in anything but a blur. Our brains don't make that motion look sharp; they just ignore the blur.
 
... lighting become a serious problem with high frame rates.
Not really. This is a version of the large pixels myth but in the time variable.
Not sure I even understand what you are trying to say here.
Sampling the scene more frequently does not increase noise; it increases temporal resolution. Similarly, smaller pixels do not increase noise just because they are smaller.
The lighting issue with high frame rates is not about pixel size. It's a matter of exposure time.

Video shot at 60fps can have each frame exposed to light for no more than 1/60s.

Video shot at 1000fps can have each frame exposed to light for no more than 1/1000s.

That's what makes lighting a problem as frame rates keep increasing.
The leaps and bounds in high-ISO capability in recent years more than make up for the increase in shutter speed.
Depends on how far you leap.
No one is talking about 1000fps video right now,
I'm someone, and I've been talking about it - but probably not in the way you assumed.
but if everyone starts shooting at 1/120s it will only benefit the movie industry as a whole.
I wasn't talking about the movie industry, but I guess you are.
3D already incurred a huge lighting penalty anyway, so most Hollywood studios should already have everything they need.
I wasn't talking about Hollywood, but I guess you are.

Since we're talking about different subjects, and since I've addressed my subject in more detail with someone else here already, we probably don't need to talk at all.
 
Funny, HERE is a research paper from 2014 that shows our brains can process images at about 75fps. You see, this is the problem when you start throwing around the word "tested": tested by whom?
If you were paying attention, here I linked to an article that references that very study.

There is a difference between discerning information and discerning motion.
On top of that, not everybody has identical DNA; some people's reflexes are far apart, and vision varies too. You have not proven anything you have said, just made claims.
There are variations in DNA, this is true. But they are within a relatively tiny range, and there is a limit to improvement.
Actually it's exactly the opposite: we see blurry things, but our brains make them sharp in our mind.
What? OK, I didn't say it perfectly. Our brains don't actually blur the motion, but they are not capable of registering close-up motion in anything but a blur. Our brains don't make that motion look sharp; they just ignore the blur.
OK, sounds like you just like playing semantics games. Discerning "information" and "motion": there is information whether something is moving or not; our brain is constantly processing info, and if you are moving even the slightest bit, your vision isn't static.

And our brains don't make it look sharp, they just ignore the blur. OK? Sounds like when it ignores the blur, looking sharper is the only outcome.
 
Funny, HERE is a research paper from 2014 that shows our brains can process images at about 75fps. You see, this is the problem when you start throwing around the word "tested": tested by whom?
If you were paying attention, here I linked to an article that references that very study.

There is a difference between discerning information and discerning motion.
On top of that, not everybody has identical DNA; some people's reflexes are far apart, and vision varies too. You have not proven anything you have said, just made claims.
There are variations in DNA, this is true. But they are within a relatively tiny range, and there is a limit to improvement.
Actually it's exactly the opposite: we see blurry things, but our brains make them sharp in our mind.
What? OK, I didn't say it perfectly. Our brains don't actually blur the motion, but they are not capable of registering close-up motion in anything but a blur. Our brains don't make that motion look sharp; they just ignore the blur.
OK, sounds like you just like playing semantics games.
Not semantic at all. Actually read the article I linked.
Discerning "information" and "motion": there is information whether something is moving or not; our brain is constantly processing info, and if you are moving even the slightest bit, your vision isn't static.
For a forum dedicated to a visual medium, not many people understand the visual part very well. I don't feel like creating a seminar on how the brain processes information, and the link I provided gives a good enough basic illustration of this part of it.
And our brains don't make it look sharp, they just ignore the blur. OK? Sounds like when it ignores the blur, looking sharper is the only outcome.
No. Wave your hand in front of your face and, if you pay attention, you notice the blur. Most of the time when people do this, they just register that the hand has moved; they don't really think about whether it was sharp. In other words, the brain doesn't make the motion appear sharper; it ignores sharpness completely.
 
Since we're talking about different subjects, and since I've addressed my subject in more detail with someone else here already, we probably don't need to talk at all.
The conversation you were not a part of and then quoted was full of rhetoric that does apply to big budget movies.
 
I wasn't talking about Hollywood, but I guess you are.
The conversation you were not a part of and then quoted was full of rhetoric that does apply to big budget movies.
Didn't he say that? Yes, he did, and therefore your reply was unnecessary.
A note on why big budget productions/Hollywood is relevant to this conversation.

Whilst budget is certainly a consideration, and there are eccentricities, much of what is done is done with purpose: to serve the telling of the story, to get the filmmaking out of the way. Cinema is also the test bed for experimentation, because the audience provides broader feedback.

Given that cinema is the gold standard, technique and equipment will tend to work within professional parameters.

Individuals can want/prefer whatever, but equipment manufacturers are not spending money to chase every vagrant whim.
 
