DIGIC V

yes, I very much want to see the logic. Show me the chain of reasoning which says a digital processor (and an algorithm is an algorithm, whether it is executed in a dedicated co-processor or by a program running on a general purpose processor).
What are you babbling about?
Simple, an algorithm is an algorithm, whether it is executed in a dedicated co-processor or by a program running on a general purpose processor
--
Bob
 
The only SW programmable ISP I've seen is the one from SiliconHive. All the rest I know are fixed HW with some programmable parameters.

No "SW only" algorithm set I've seen is able to process 100+ Mpixels per second (required for e.g. 18MP at 6fps). In a 1GHz processor you then have some 10 single-cycle instructions per pixel, or 20 if you take a dual core.
 
Unless, of course, you process multiple output channels from the sensor in parallel.
With four channels, for example, you only have 25 Mpixels per second per channel.
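The instruction-budget arithmetic above can be sketched in a few lines. This is a back-of-the-envelope calculation only; the 1 GHz clock and 18MP-at-6fps figures come from the posts, everything else is an assumption:

```python
# Back-of-the-envelope instruction budget for in-camera image processing.

def cycles_per_pixel(clock_hz, cores, mpixels_per_s):
    """Single-cycle instructions available per pixel at a given throughput."""
    return clock_hz * cores / (mpixels_per_s * 1e6)

throughput = 18 * 6  # 18 MP at 6 fps -> 108 Mpixels/s
print(cycles_per_pixel(1e9, 1, throughput))  # ~9.3 instructions per pixel
print(cycles_per_pixel(1e9, 2, throughput))  # ~18.5 on a dual core

# Splitting the sensor readout into four parallel channels divides the
# per-channel rate to ~27 Mpixels/s, quadrupling the per-channel budget.
print(cycles_per_pixel(1e9, 1, throughput / 4))
```

A budget of ~10 instructions per pixel leaves no room for demosaicing, noise reduction and JPEG encoding in pure software, which is the crux of the disagreement in this thread.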
 
Look, this is really simple ...

There is some form of code (software/firmware, call it whatever the hell you like), most likely in assembly language, running on the DSP in your camera. End of story.

You can argue it if you like but it won't make you right.
 
This thread really is useful for me. It made me realise that I should be outside taking photos rather than reading up on this stuff.
--
-----------------
Phil M. - Toronto, Canada

If you have some time to chew on a few of my photos, please go to: http://www.flickriver.com/photos/phil_marion/sets/
 
Good for you.

However, most of the people who talk about this stuff have engineering backgrounds. It's okay if you have no clue what we're talking about. Some of us are more than just photographers.
 
Look, this is really simple ...

There is some form of code (software/firmware call it whatever the hell you like), most likely in assembly language, running on the DSP in your camera. End of story.

You can argue it if you like but it won't make you right.
Sure. And please show me the general purpose DSP capable of doing the full per-pixel processing at a speed of 100 Mpixels per second - not to say anything of the JPEG encoding.

I admit I've not followed DSP development too closely in recent years, but I was a DSP professional in the 1990's.
 
It's not my intention at all to call your expertise into question; I'm only disagreeing with your stance that the DSP is simply fixed HW.

With the complexity of Canon's DSLRs, it's not possible to have a DSP that only has fixed registers with programmable parameters.

I'm not saying the DSP is emulated by Software via some VHDL or something, but rather that the contents of those fixed registers on the DSP are being manipulated by handwritten code that is capable of performing the typical add, move, bitwise shift and logical operations on registers.

It's more than fixed hardware, there has to be something driving the automation.

Maybe I don't understand what you're saying, but there is a need for the basic assembly language operations (ADD, MOVE, SHIFTs).
 
The only SW programmable ISP I've seen is the one from SiliconHive. All the rest I know are fixed HW with some programmable parameters.

No "SW only" algorithm set I've seen is able to process 100+ Mpixels per second (required for e.g. 18MP at 6fps). In a 1GHz processor you then have some 10 single-cycle instructions per pixel, or 20 if you take a dual core.
Your point is? I never said anything about 'software only algorithms'. It is quite clear from the TI structure diagrams that these things are dripping with hardware accelerators. The point is, in the end, however nifty the hardware accelerator is, it can't do anything a Turing-complete machine can't; it might just do it faster.
--
Bob
 
100 Mpixels a second should not be a problem. From what I understand, the OMAP processor in a few of the Droid phones could come close to a simple 100 Mpixels/sec.

If you think about it, 1080p @ 30fps is over 60 Mpixels/sec. 240Hz TVs with all of the inter-field interpolation are doing close to 500 Mpixels/sec.

I have no doubt there is programmable logic more than capable of the raw task at hand. The question is being able to do 1000+ pictures on a charge of the battery!

Creating the JPGs is easy; the de-mosaic, color correction, noise reduction and the like are where the hard IP is, from what I understand.

What I think would be really interesting to see is adaptive focus spots. With mirrorless cameras I think it would be interesting to place an LCD overlay over the image; the camera could then gray out, or something, the areas that are out of focus. So as you manually focus, the camera could show you what is in focus and what is not.

just saying.
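The "show what's in focus" idea above is essentially focus peaking. A minimal sketch of how such a mask might be computed, assuming NumPy; the function name, threshold, and toy frame are all hypothetical choices, not anything a real camera uses:

```python
import numpy as np

def focus_peaking_mask(gray, threshold=30.0):
    """Boolean mask of 'in focus' pixels: local gradient magnitude
    above a threshold, as a crude proxy for sharpness."""
    gy, gx = np.gradient(gray.astype(np.float64))
    return np.hypot(gx, gy) > threshold

# Toy frame: a sharp edge (high gradient) next to flat, defocused-looking
# regions. The edge pixels get flagged; the flat areas do not.
frame = np.zeros((8, 8))
frame[:, 4:] = 255.0
mask = focus_peaking_mask(frame)
print(mask.any(), mask.all())
```

A camera would overlay this mask on the live view (graying out or coloring the unflagged pixels) and recompute it per frame as you focus.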
 
For video it really can make a true difference in quality, though. If it is fast enough to properly sample, scale and filter the entire image per frame, we could see video with much less noise, little to no moire/aliasing/flickering details and other nasties, and much better detail - true 1920x1080p.

For stills, on the other hand, it is really only the in-camera JPEGs that could be made much better; RAW is what it is once the sensor and readout electronics have done their thing, and there is no magical way to make it better. Although a faster processor can of course push through more fps for a given resolution/depth.
The next EOS & PowerShot processor

It has been a long time since we spoke about or saw an updated DIGIC processor in a Canon Camera. It seems DIGIC IV has been around for a long time, 2008 to be exact.

I was told today that DIGIC V has been in development for at least 4 years. A lot has changed during its development because of the explosive success of the 5D Mark II and HD video.

Features of the next DIGIC Processor

■Still and video noise reduction that will be industry revolutionary (I like that word)
■The highest throughput on the market
■Dynamic range improvements, unknown to what extent
■New video codec. RAW video is unknown, but the throughput will be attainable
■New liveview AF abilities, unknown whether that will affect video AF
■New “creative” tools for new DSLR and compact features

http://www.canonrumors.com/
The DIGIC is simply a processor - actually an ARM applications processor with image processing co-processors, most probably one of Texas Instruments' DaVinci series. Going through your list, those features are software or, if you like, firmware, not features of the processor. It is hard to see how a digital processor can improve dynamic range; it's too late in the imaging chain for that.
--
Bob
I am glad that you point this out. I have also been wondering why people are so concerned with what processor a manufacturer puts into a camera. A camera is a fixed-function computer, a dedicated computer. It either works as designed or not, yet people seem to be excited about what kind of processor is used.

IMHO, we should be concerned with the final features and performance of a body. Unless there is a way a customer can upgrade the processor themselves, or can run other applications on it. Perhaps the latter is coming - a camera that can get to an app store and download some app to run. In that case, I think it is reasonable to care about what processor/speed a body has.

Regards.
 
which is where the RAW file is created. The faster/better the processor, the higher the IQ can be (depending on the hardware-coded functions and software).

With the introduction of the Digic III it was possible to increase the tonal depth from 12 to 14 bit, which is a major improvement IMO.

Chris
No, the signal was too much of a mess to even be able to use 14 bits; the 14 bits just wasted space on your hard drive.

Digic does not do the A/D.
 
Sorry to disabuse you. An i5 has oodles more image processing ability than a DIGIC; it has very similar image co-processors, only they work about twice as fast and there are more of them. What you are observing is not how long it takes the processor to do image processing calculations, but the IO overhead of a ludicrously inefficient operating system.
I have to disagree with that strongly.
 
Digic is not just any processor, it is a dedicated processor with hardware algorithms, which are much faster than software based solutions.
I think you are making incorrect assumptions. Bob is right that the Digic is just a 'normal' ARM processor and it's hardly any more specialized than the processor used in the iPhone (also ARM).
Nonsense, there is no way that could handle processing images that fast. They also have a custom chip for h.264, since there is no way in the world that little ARM CPU could handle either encoding or decoding 1920x1080p h.264 at that profile level in real time.
 
The iPhone has a fancy custom graphics chip (fancy for a low-power chip, that is) from the UK mixed into its main core, and the Xoom has a fancy chip (for low power) from Nvidia.

I don't know why a few people, not yourself, seem to think it's just general purpose CPU doing everything.
Digic is not just any processor, it is a dedicated processor with hardware algorithms, which are much faster than software based solutions.
I think you are making incorrect assumptions. Bob is right that the Digic is just a 'normal' ARM processor and it's hardly any more specialized than the processor used in the iPhone (also ARM).
I have not seen an ARM core in a standalone chip (like x86-based processors or their variants). It is typically part of a System on Chip, i.e. the whole chip integrates other peripherals that make an embedded solution for a small form factor device easier/lower cost to manufacture. The iPhone's or any Android smartphone's SoC (with ARM) does have an LCD controller, A/D, small boot ROM, USB controller and, I believe, even some image processor included. I suspect DIGIC V is similar, in that it has other things that are useful for image/video processing support.
For example, it takes a modern Intel i5 2.5GHz dual-core CPU a second or so to do the most basic NR on an 18MP raw file, then a few more seconds to convert it to JPEG, while it takes the Digic 4 almost no time to do the whole thing. So "improving image processing" when making a chip is completely normal.
And that's because the in-camera algorithms for raw->JPEG conversion and noise reduction are more primitive than the software-based solutions. Hence the lower quality of in-camera JPEGs compared to Lightroom, for example.
Regards.
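To put rough numbers on the software-only path discussed above, here is a naive single-pass blur of the kind basic NR builds on, written in NumPy. This is emphatically not what Lightroom or the camera does; it is just one simplistic pass over an 18MP-sized frame, timed, with the frame dimensions an assumption:

```python
import time
import numpy as np

def box_blur3(img):
    """Separable 3x3 box blur; edges handled by replicating border pixels."""
    p = np.pad(img.astype(np.float32), 1, mode="edge")
    h = (p[:, :-2] + p[:, 1:-1] + p[:, 2:]) / 3.0   # horizontal pass
    return (h[:-2, :] + h[1:-1, :] + h[2:, :]) / 3.0  # vertical pass

# ~18 MP frame of 14-bit raw-like data (dimensions are assumed).
raw = np.random.randint(0, 2**14, size=(3456, 5184), dtype=np.uint16)
t0 = time.perf_counter()
out = box_blur3(raw)
dt = time.perf_counter() - t0
print(f"{raw.size / 1e6 / dt:.0f} Mpixels/s for one naive blur pass")
```

Even a pass this trivial is only one of many stages in a real NR chain, which is why the full software pipeline takes on the order of seconds per frame while dedicated hardware keeps up with the sensor.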
 
It's not my intention at all to call your expertise into question, I'm only disagreeing with your stance that the DSP is simply fixed HW..
I simply cannot imagine them using a fixed ASIC; programmable logic would make a lot more sense. The extra test/dev time to go fixed ASIC would be huge! I have no idea of the volumes, but I just can't imagine it being worth it for fixed hardware.

There are very few places still using custom fixed ASICs or custom dies. The scale needs to be on the iPhone/iPad scale (Apple's A4, now A5 processor), and I don't think Canon sells as many DSLRs as Apple sells iPads.
With the complexity of Canon's DSLRs, it's not possible to have a DSP that only has fixed registers with programmable parameters.
If they can fix image problems with firmware updates, it's programmable logic.
I'm not saying the DSP is emulated by Software via some VHDL or something, but rather that the contents of those fixed registers on the DSP are being manipulated by handwritten code that is capable of performing the typical add, move, bitwise shift and logical operations on registers.
It is full-on VHDL and C, to my best guess/knowledge. It may be an Atom processor; those things are amazing - the product I am working on uses the absolute bottom end of that line, or similar, to do 720p or two simultaneous 480i MPEG-4 compressions with audio.

Or it could be something like the Cell processor in the PS3. But the roots are programmable logic, FPGA/DSP, with probably an ARM core running L*nux or such.
It's more than fixed hardware, there has to be something driving the automation.

Maybe I don't understand what you're saying but there is a need for the basic assembly language operations (via ADD, MOVE, SHIFTS
Think massively parallel programmable/fixed processing.

I will ask a guy at work who has co-designed/built a couple of video camera systems if he knows what Nikon/Canon are using for their processor.

Seriously, probably the biggest problem is power consumption: booting the e-Linux in a second or two, or keeping the DRAM alive while in standby.
 
The Canon 5D Mark III and the 7D II will get the DIGIC V, I think...

Do you think the Canon 7D II will get two DIGIC Vs?
Having dual Digics is an unnecessary complication. If I were Canon, I'd design the Digic V so that just a single one will suffice in the 7DII and the 5DIII.

Only the 1-series is expensive enough to warrant the use of two Digics rather than one.
You do realize that these Digic chips cost mere dollars. Even the cheapest P&S cameras that only cost a few bucks have them.
 
Digic is not just any processor, it is a dedicated processor with hardware algorithms, which are much faster than software based solutions.
I think you are making incorrect assumptions. Bob is right that the Digic is just a 'normal' ARM processor and it's hardly any more specialized than the processor used in the iPhone (also ARM).
Nonsense, there is no way that could handle processing images that fast. They also have a custom chip for h.264, since there is no way in the world that little ARM CPU could handle either encoding or decoding 1920x1080p h.264 at that profile level in real time.
I said it is an ARM applications processor - the term 'applications processor' has come to mean a 'system on a chip'. The TI DaVinci, which it probably is, has a whole number of specialised processors for image processing and also a general purpose DSP in addition to the ARM core. The same is true of the iPhone processor, also an applications processor.
--
Bob
 
I will ask a guy at work who has co-designed/built a couple of video camera systems if he knows what Nikon/Canon are using for their processor.
I have co-designed/built a couple of video camera systems, and I know what Nikon and Canon are using. Canon are using an ARM applications processor. That information comes from CHDK (Canon Hack Development Kit), who develop hacks for Canon cameras:

http://chdk.wikia.com/wiki/CHDK . CHDK believe it to be the TI DaVinci series of applications processors, which closely match the devices they find programmed in the firmware.

http://focus.ti.com/dsp/docs/dspplatformscontenttp.tsp?sectionId=2&familyId=1300&tabId=1854
The Nikon 'EXPEED' is a Fujitsu FR-V, as evidenced by the firmware strings.
--
Bob
 
It's not my intention at all to call your expertise into question, I'm only disagreeing with your stance that the DSP is simply fixed HW..
I've never said a DSP (= digital signal processor) is fixed HW. I've certainly written enough programs for them.

My statement was and is that an ISP (= image signal processor) is most typically implemented as relatively fixed HW where you can set parameters for CFA, gamma, etc. But you cannot use it to do almost anything other than what it's designed for, i.e. process the raw data stream captured from the image sensor. Some parts, like the scalers, can in some architectures be used separately as well (assuming the SW system is made to allow their independent use).
With the complexity of Canon's DSLRs, it's not possible to have a DSP that only has fixed registers with programmable parameters.
I've not seen any system where the ISP is alone; usually there is a scalar processor for control and some sort of DSP or vector processor for more complex algorithms. Some of the vector processors I've seen are made to allow more general purpose use; some quite often use several special-purpose HW accelerators to implement a limited set of algorithms, but more efficiently and using less power.
I'm not saying the DSP is emulated by Software via some VHDL or something, but rather that the contents of those fixed registers on the DSP are being manipulated by handwritten code that is capable of performing the typical add, move, bitwise shift and logical operations on registers.

It's more than fixed hardware, there has to be something driving the automation.

Maybe I don't understand what you're saying but there is a need for the basic assembly language operations (via ADD, MOVE, SHIFTS)
The simplest DSP I've seen and used was the NEC uPD7720, which had three (3) instructions: the literal load, the conditional jump, and the OP instruction to do all sorts of calculations. Not very "general purpose", but fun to use in the early 1980's for the first real-time digital signal processing implementations I made.
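The "fixed HW with programmable parameters" model described in this exchange can be caricatured in a few lines: the stage order is hard-wired (the function body below), and "programming" the ISP means loading its parameter registers, such as a black level, a gain, and a gamma lookup table. This is purely illustrative, not any real ISP's pipeline, and every name and number is a made-up assumption:

```python
import numpy as np

def isp_pipeline(raw, black_level, gain, gamma_lut):
    """Fixed stage order: black-level subtract -> gain -> gamma lookup.
    Only the parameters are 'programmable'; the stages themselves are not."""
    x = np.clip(raw.astype(np.int32) - black_level, 0, None)  # stage 1
    x = np.clip(x * gain, 0, 4095).astype(np.int32)           # stage 2
    return gamma_lut[x]                                       # stage 3

# "Programming" the fixed pipeline = writing its parameter registers.
lut = (255 * (np.arange(4096) / 4095.0) ** (1 / 2.2)).astype(np.uint8)
raw = np.array([[64, 2048, 4095]], dtype=np.uint16)  # 12-bit raw samples
print(isp_pipeline(raw, black_level=64, gain=1, gamma_lut=lut))
```

You can change the CFA parameters, the gamma curve, or the gain at will, but you cannot make a pipeline like this run an arbitrary algorithm, which is exactly the distinction being drawn between an ISP and a programmable DSP.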
 
