ShortestPath

Joined on Oct 28, 2015

Comments

Total: 23, showing: 1 – 20

PSA: Adobe's GPU FAQ mentions that "Your system may automatically support basic or full acceleration. If your system automatically supports basic acceleration, you can enable full acceleration using the Custom option."

That was the case for my machine (Classic Mac Pro w/ Vega 56): only basic acceleration was enabled after the update. Turning on full acceleration made a world of difference; now scrolling through the library is almost smooth, and adjustments are processed much faster. It used to be the case that selecting a bunch of images and giving them +1 stop of exposure would result in the images being slowly updated one by one. Now they update much faster and *all at the same time*.

Almost forgot: on Mojave it definitely uses Metal, not OpenGL.

Link | Posted on Aug 14, 2019 at 00:43 UTC as 40th comment
On article Quick look: Canon's new compressed Raw format (234 comments in total)
In reply to:

(unknown member): For some people, any kind of lossy compression isn't worth using. It's a nice option to have, but storage space is pretty cheap. It's not usually a big deal to just carry additional SD cards. You can compress later on your computer, as appropriate. Personally I wouldn't risk using this feature; sure the loss of quality is impressively small, but why take the hit at all?

That said, it's a nice feature for those who need it. I doubt I'd have a use for it myself, but it harms nothing to have the option.

If the compression were lossless, there would be no difference at all in image quality; since the article did find differences, lossy compression is a fair assumption.

The article confirms this when it points out that what they saw in the pushed compressed files “is reminiscent of artifacts left behind from noise reduction algorithms that we've seen in the past”.
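
To illustrate the lossless point with a quick sketch (zlib here is just a stand-in for any lossless codec, nothing to do with Canon's actual compression): a lossless round trip gives back the input bit for bit, so image quality cannot change at all.

    import zlib

    # Stand-in payload; with a lossless codec the content doesn't matter.
    original = bytes(range(256)) * 4096  # 1 MiB of repetitive "data"

    compressed = zlib.compress(original, 9)
    restored = zlib.decompress(compressed)

    # Lossless means the round trip is bit-for-bit exact, so quality cannot change.
    assert restored == original
    print(f"{len(original)} bytes -> {len(compressed)} bytes, exact round trip: {restored == original}")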

Link | Posted on Apr 2, 2018 at 15:06 UTC
On article New Sony a9 firmware fixes overheating warning (84 comments in total)
In reply to:

nbeiii: Just tried to get the firmware update. "The update is available, but the site is down for maintenance." Try back later. An unknown "later".

Maybe the website is overheating too.

Link | Posted on Jun 7, 2017 at 06:05 UTC
On article Canon will add C-Log to the EOS 5D Mark IV for $99 (439 comments in total)
In reply to:

Fujica: shameful considering other manufacturers give you firmware updates for free.

There is no hardware change needed considering the fact that it is just a codec. Canon just wants you to believe... Horrifying to see how Canon keeps treating their customers.

The fee is for the included "hardware upgrade", which is likely going to be aimed at improving the cooling. No need for righteous indignation.

Link | Posted on Apr 20, 2017 at 13:31 UTC
In reply to:

DavidsfotosDotCom: Anyone who has owned a 5D mk 2, 3 or 4 & this or any Quattro have experience shooting sunrise / sets with clouds? I need more dynamic range & less color noise at iso 1600 - 3200. Current MF's not a solution!
Please advise.

How about ISO 100 and a tripod? That will buy you 4–5 stops of DR.
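
The 4–5 figure is just the ISO drop itself, assuming (roughly) one stop of highlight DR regained per stop of ISO given up:

    from math import log2

    base_iso = 100
    for iso in (1600, 3200):
        # Each halving of ISO is one stop: 1600 -> 100 is 4 stops, 3200 -> 100 is 5.
        print(f"ISO {iso} -> ISO {base_iso}: {log2(iso / base_iso):.0f} stops")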

Link | Posted on Apr 4, 2017 at 19:51 UTC
In reply to:

noirdesir: Looking at the lens speeds I wonder if XCD lenses are a bit restricted in regard to maximum f-stop by the lens shutter.

IMHO they are just trying to keep size and weight down for their light MF system.

Link | Posted on Mar 1, 2017 at 05:35 UTC
In reply to:

SimenO1: The 16-35: What year does Canon believe it is? 2006? They made it significantly bigger and heavier than the predecessor and they didn't even manage to get IS in there?

The 24-105: What year does Canon believe it is? 2011? They made it significantly bigger and heavier than the predecessor and the stabilizer is just 4 stops and 2 axes.

Both are ridiculous products compared to Pentax 15-30mm f/2,8 and 28-105mm f/3,5-5,6 mounted on the K-1. Both combinations give 5 stops 5 axis stabilization. This is the standard Canon should be aiming at in 2016.

High resolution creates demand for stabilization even for wide angle. Yes, you might choose a larger aperture instead, but not all want to shell out the money, or deal with the shallower DoF. Stabilization is very useful regardless of larger aperture lenses.

There should be a DPR-specific variant of Godwin's law regarding Pentax...

Link | Posted on Aug 28, 2016 at 14:32 UTC
In reply to:

LEGACYMOMENTSPHOTOGRAPHY: DOES IT SHOOT 4K........................................? LOL!

You should do something about that keyboard of yours.

Link | Posted on May 21, 2016 at 00:06 UTC
In reply to:

Morpho Hunter: Well (presumably) this camera is already "dead in the water" after it's been rumoured that Olympus's EM-1 mark II camera (with its comparatively tiny, super high quality lenses) will produce 80 megapixel hand-held images .. for a fraction of the cost of the Hassy body and lenses..... but hey .. it could all be a scam!! The era of full frame and medium format cameras is about to end ...

You can't be serious.

Link | Posted on May 21, 2016 at 00:04 UTC
In reply to:

forpetessake: Good idea and another point of failure.

I bet the OP used to voice the same complaint about AF a while back ("Lenses with a *motor* in them? That's recipe for DISASTER!").

Link | Posted on May 12, 2016 at 21:43 UTC
In reply to:

pulsar123: Why so much testing/discussing of jpeg quality - does anyone actually shoot jpeg on a 1DX?! This would be appropriate for the Rebel line, but not for the top tier line of Canon DSLRs.

Think pro sport photography or photojournalism, where timing *matters*.

Link | Posted on May 4, 2016 at 14:07 UTC
In reply to:

Retro1976: Thanks Adobe. As a software engineer I understand these things happen. Overall I am very impressed with the CC on my Mac.

As a software engineer, I can tell you that shipping code with a bug like this is completely unacceptable, especially on a non-time-sensitive update. Take your time to do proper testing, Adobe!!!

It only makes it more painfully obvious that Adobe lags way behind companies like Google and Apple in terms of coding practices (not to mention UI design): they are pretty much on par with Microsoft in this regard.

Link | Posted on Feb 16, 2016 at 00:06 UTC
In reply to:

garyknrd: What a bunch of nonsense.

What is nonsense exactly? That lenses exhibit copy variation? That said variation can be reduced only through careful design, QA and feedback from those who repair such lenses? Or that making "perfect" lenses on a large scale and at a reasonable price is not really feasible? Lens copy variation is a well-established fact, the rest all seem well-argued points to me.

Link | Posted on Feb 12, 2016 at 07:50 UTC
In reply to:

tecnoworld: At 16fps, it beats the samsung nx1 by 1 frame, but nx1 can do 15fps with AF and exposure, while this can do 14fps with those features on. Yet nx1 has 8MP pixels more, hence has to have faster readout and processing speed.

Curious to see the 4k compared with that from NX1.

The 1D X II indeed can do 16fps in live view with both metering and AI SERVO thanks to the dual pixel sensor, which is pretty cool.

Link | Posted on Feb 2, 2016 at 23:17 UTC
On article Nikon 24-70mm F2.8 ED VR real-world sample gallery (111 comments in total)
In reply to:

HowaboutRAW: Some much higher ISO samples, raw, in difficult lighting would be nice too.

For example in bad low indoor fluorescent lighting at say ISO 10,000 with the D810, that's a good challenge.

At such high ISO the noise would eat away a lot of the sharpness and color would be worse, making it pretty hard to tell a great lens from a not-so-good one.

It'd be mostly a test of the camera's capabilities rather than the lens'.

Link | Posted on Jan 15, 2016 at 13:56 UTC
On article Nikon's New D5 and D500 Push the Boundaries of DSLR (717 comments in total)
In reply to:

ShortestPath: Rishi, the buffer of the D500 is not 200 shots (slide 1): Nikon is quoting the maximum number of continuous shots that can be taken, and this number is very high if using a fast XQD card.

The buffer capacity, however, is the number shown in the viewfinder before shooting, which is the size of its memory divided by the size of a RAW file. This number is not absolute (it changes with the ISO setting), but even in the best case scenario we are talking an order of magnitude of difference with the number you are quoting: see http://www.dpreview.com/forums/post/57053571.

Bottom line: you should call Nikon out on their marketing shenanigans, the revolutionary thing here is the XQD interface, not the size of the buffer.

To have something to compare it with, the same exercise for the 7d2 (but with certainty on the file size, since I own the camera) gave me an estimated buffer size of 484MB.

These numbers are very comparable, and hint strongly at the fact that the great maximum burst performance of the D500 is to be attributed entirely to the fast XQD/SD interfaces, and not to a particularly large buffer. It is also worth noting that 468/30 = 15.6, not far from the number that Horshack saw in the D500 viewfinder (14). The D5 instead reported 99 shots, which would suggest that its buffer has indeed grown considerably in size.

As a final point, a bit of sensitivity analysis, since we made an assumption on the file size: the same calculation with 25MB and 35MB files yields buffer sizes of 98MB (unlikely) and 838MB respectively, the latter still not that far from the 7d2's.
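
For anyone who wants to play with the numbers, the sensitivity check is easy to reproduce; the 74-shot burst and the ~1,752MB written to the card during it come from the calculation further down, and the per-file sizes are still just guesses:

    # Sensitivity of the estimated D500 buffer size to the assumed RAW file size.
    SHOTS_BEFORE_SLOWDOWN = 74   # from Horshack's test
    DATA_WRITTEN_MB = 1752       # ~7.3s of writing at 240MB/s, see the calculation below

    for raw_file_mb in (25, 30, 35):
        buffer_mb = SHOTS_BEFORE_SLOWDOWN * raw_file_mb - DATA_WRITTEN_MB
        print(f"{raw_file_mb}MB files -> ~{buffer_mb}MB buffer (~{buffer_mb / raw_file_mb:.1f} shots)")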

Link | Posted on Jan 11, 2016 at 03:51 UTC
On article Nikon's New D5 and D500 Push the Boundaries of DSLR (717 comments in total)
In reply to:

ShortestPath: Rishi, the buffer of the D500 is not 200 shots (slide 1): Nikon is quoting the maximum number of continuous shots that can be taken, and this number is very high if using a fast XQD card.

The buffer capacity, however, is the number shown in the viewfinder before shooting, which is the size of its memory divided by the size of a RAW file. This number is not absolute (it changes with the ISO setting), but even in the best case scenario we are talking an order of magnitude of difference with the number you are quoting: see http://www.dpreview.com/forums/post/57053571.

Bottom line: you should call Nikon out on their marketing shenanigans, the revolutionary thing here is the XQD interface, not the size of the buffer.

Why is this difference important? First because we should call things for what they are :), but also because mixing one with the other can be confusing, and can lead to the wrong answer when we ask “why is this camera better?”. I’ve seen several people understandably draw the conclusion that the D500 must have a lot of memory in its buffer. But how big is it really? Let’s do some back-of-the-envelope calculations.

Horshack mentions that the camera slowed down after 74 shots, indicating that the buffer was full, with a 260MB/s SD card. A quick Google search reveals that the only card with that read speed is a Toshiba, and its maximum write speed is 240MB/s. The other variable we need is the size of a RAW file, and we don’t have it: let’s make an educated guess based on 7d2 files and say that it’s 30MB. So we have:

Input: 30 * 74 = 2,220 MB
Shooting duration: 74 shots @10 fps = 7.4s
Data written while shooting: ~7.3s (first frame to last) * 240MB/s = 1,752MB
Estimated buffer size: 2,220 - 1,752 = 468MB
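
Or, as a few lines of Python; the 30MB file size and the idea that the card starts being written once the first frame lands in the buffer are the guesses here:

    # Back-of-the-envelope estimate of the D500 buffer from the 74-shot burst.
    raw_file_mb = 30        # assumed RAW size (educated guess from 7d2 files)
    shots = 74              # frames before the camera slowed down
    fps = 10
    card_write_mb_s = 240   # max write speed of the 260MB/s-read Toshiba card

    data_in_mb = raw_file_mb * shots                # 2,220MB produced by the sensor
    write_time_s = (shots - 1) / fps                # ~7.3s from the first frame to the last
    data_out_mb = write_time_s * card_write_mb_s    # ~1,752MB drained to the card
    buffer_mb = data_in_mb - data_out_mb            # ~468MB left sitting in the buffer

    print(f"Estimated buffer: {buffer_mb:.0f}MB (~{buffer_mb / raw_file_mb:.1f} RAW files)")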

Link | Posted on Jan 11, 2016 at 03:51 UTC
On article Nikon's New D5 and D500 Push the Boundaries of DSLR (717 comments in total)
In reply to:

ShortestPath: Rishi, the buffer of the D500 is not 200 shots (slide 1): Nikon is quoting the maximum number of continuous shots that can be taken, and this number is very high if using a fast XQD card.

The buffer capacity, however, is the number shown in the viewfinder before shooting, which is the size of its memory divided by the size of a RAW file. This number is not absolute (it changes with the ISO setting), but even in the best case scenario we are talking an order of magnitude of difference with the number you are quoting: see http://www.dpreview.com/forums/post/57053571.

Bottom line: you should call Nikon out on their marketing shenanigans, the revolutionary thing here is the XQD interface, not the size of the buffer.

I’ll try to make my point clearer, but first let me elaborate a little on the concepts of “buffer size” and “maximum burst”.

The former, as I mentioned in my previous post, is the size of the memory in the camera that has the job of storing the pictures while they wait to be written to the card(s), measured in number of shots (since the manufacturers are not kind enough to provide the actual size in MB). This number is the one you see - estimated - in the viewfinder while shooting, and it is conservative, since it does not take into account that the camera will start writing to the card as soon as you start shooting, therefore making space for more shots. The latter does not need an explanation, and it is by definition greater than or equal to the buffer depth.

Link | Posted on Jan 11, 2016 at 03:50 UTC
On article Nikon's New D5 and D500 Push the Boundaries of DSLR (717 comments in total)
In reply to:

ShortestPath: Rishi, the buffer of the D500 is not 200 shots (slide 1): Nikon is quoting the maximum number of continuous shots that can be taken, and this number is very high if using a fast XQD card.

The buffer capacity, however, is the number shown in the viewfinder before shooting, which is the size of its memory divided by the size of a RAW file. This number is not absolute (it changes with the ISO setting), but even in the best case scenario we are talking an order of magnitude of difference with the number you are quoting: see http://www.dpreview.com/forums/post/57053571.

Bottom line: you should call Nikon out on their marketing shenanigans, the revolutionary thing here is the XQD interface, not the size of the buffer.

You're welcome, glad to hear that you're on top of it (as usual)!

Link | Posted on Jan 9, 2016 at 05:30 UTC
On article Nikon's New D5 and D500 Push the Boundaries of DSLR (717 comments in total)

Rishi, the buffer of the D500 is not 200 shots (slide 1): Nikon is quoting the maximum number of continuous shots that can be taken, and this number is very high if using a fast XQD card.

The buffer capacity, however, is the number shown in the viewfinder before shooting, which is the size of its memory divided by the size of a RAW file. This number is not absolute (it changes with the ISO setting), but even in the best case scenario we are talking an order of magnitude of difference with the number you are quoting: see http://www.dpreview.com/forums/post/57053571.

Bottom line: you should call Nikon out on their marketing shenanigans, the revolutionary thing here is the XQD interface, not the size of the buffer.

Link | Posted on Jan 8, 2016 at 23:23 UTC as 51st comment | 8 replies