melgross

Joined on Oct 6, 2010

Comments

Total: 723, showing: 1 – 20
In reply to:

tvstaff: I guess there will always be people who let the camera tell them what to do. I tell my camera what to do.

The camera might be able to follow a ball, or a face.... but that's not what we want most of the time. I'm using an OVF to setup my next shot looking a 100 things. 99 of which I have no desire to focus on or take a shot of.

For many, I'm sure this will be helpful.

For me, it's single point, single point expanded, and in an "emergency" a zone. But I fill my frame, compose based on what I want, and adjust my DOF as I see fit. With AI Servo set to Case 4, first-shot focus at +2 and second-shot focus at +2, at 15fps you are getting exactly what you want without the camera interfering.

If you are a professional sports photographer, you intimately know your sport, your athletes and even the venues you shoot at. I don't see how my camera can trump my knowledge and skill with my equipment and my sport, and have a better sense of what I want to shoot than me.

What can we say to you? You’re “special”.

Link | Posted on Sep 23, 2021 at 15:39 UTC
In reply to:

seragram: Canon is creating separation from competition with leading technology. Not having this tech will be seen as a CON.

Potoughto, as we continue to get newer technologies, we find that we have a greater percentage of good shots. Back in 1969, when I was just 19, and beginning in fashion photography, we had the 3 fps Nikon F with motor drive. It was a great benefit, but you couldn’t rely on it for a decisive moment. It was far too slow for that, but it did help.

At about the same time, the first behind the lens meters were arriving. Also helpful, but not as good as they could have been.

Then autofocus, which was also iffy for a time.

Going back to the 1950s, automatic diaphragms were invented.

All of these technologies took some time to mature, but they did.

And all of that made things easier, and faster. Are you going to tell us that you don’t use any of them?

Link | Posted on Sep 23, 2021 at 15:37 UTC

This is not a great display. It’s no competitor to Apple’s Pro Display. This is 4K; Apple’s is 6K. Apple’s gets to 1,600 nits and can maintain over 1,000, possibly 1,200. This can maintain just 400 nits, and no maximum brightness is stated in the article. 400 nits isn’t even close to being able to display HDR content, whether creating or viewing it.

For what it is, it’s expensive.

Link | Posted on Sep 22, 2021 at 22:02 UTC as 4th comment | 18 replies
In reply to:

ArashNiknam: even the title makes me laugh! comparing cinematography to an algorithmic rack focus

Well, commercial cinematography is beginning to apply these techniques as well.

Link | Posted on Sep 20, 2021 at 18:38 UTC
In reply to:

ShaiKhulud: I mean, the blur feathering on hair is noticeable even in promo movie. And algorithm missed DoF in car's windows in first seconds of a demo movie. (kudos to Apple, footage is clearly not doctored)

It's a neat little gimmick that like Live Photo will change totally nothing. Look at f/1.4 lens bokeh first, and at the cheap AI-blur after: the difference is still gonna be striking.

Only people who look for these fairly minor defects see them. I’ve used portrait mode for years now, and except for the earlier versions, people just don’t see the minor errors in hair, the tiny missing spots in blur, the occasional edge errors, and so on.

The reality is that if no one told you what this was, you wouldn’t be looking for the errors, and wouldn’t see most of it, or even any of it. People see what they expect to see, and aren’t particularly observant.

When digital first came out, and I was working with photographers who came to my lab, the lack of grain was disturbing to some, and to some of their customers. We had to add some “grain” effects. After some time, people became used to the clarity and lack of grain that digital offered, and that complaint went away.

These effects will continue to improve. Higher resolution will make it easier for the software to recognize detail in the image, and the “problem” will go away.

Link | Posted on Sep 20, 2021 at 18:37 UTC
In reply to:

SDKat: I'm SURE Apple is thrilled with the subjectline of this story, while, REAL cinematographers may need to seek medical help to recover from the injuries they've suffered from laughing hysterically.

Apple footage is shot and produced by real pro cinematographers, including one who won two Oscars for her work.

Just in case you didn’t know, iPhones have shot Academy Award-winning movies, TV shows, and advertising, both video and still, as well as professional news, documentary, and photojournalistic work over the years, starting with the iPhone 5.

Maybe you should get out from under that rock.

Link | Posted on Sep 20, 2021 at 18:29 UTC
In reply to:

Nukunukoo: Remember when Samsung used a Pro-DSLR to claim that the video scene was captured on their newest phone?

Huawei did that too, and another company did as well.

Link | Posted on Sep 20, 2021 at 18:24 UTC

I dislike it when articles use words such as “revolutionize”. This is a good, and possibly a big, step forward, but revolutionary? Well, digital photography and video were revolutionary. This is evolutionary.

And I’m seeing criticisms from some people who should know better. It seems to work well enough for most people right now, but it’s certainly not perfect. No, you can’t use this (yet) to pull focus the way you can with a pro camera and lens costing a minimum of twenty times what these phones cost. It’s like portrait photography when Apple first introduced it: much better now, and so will these new video features be in a few years.

We have to give things time. Digital photography took years until it became really usable, and digital video took a bit longer.

But Apple does try to give us things that are useful, and not overly gimmicky.

Link | Posted on Sep 20, 2021 at 18:23 UTC as 21st comment | 2 replies
In reply to:

Kostasm: I hope for prores settings that user can select between 4:2:2 or 4:4:4 color compression.

Even many pro level video cameras don’t shoot 4:4:4.

Link | Posted on Sep 19, 2021 at 17:25 UTC
In reply to:

Judy Stone: All these fancy names and gimmicks to make you believe it's alien technology for what in essence is an AI powered point and shoot integrated camera phone. No doubt Apple is the marketing master.

It’s more than that.

Link | Posted on Sep 19, 2021 at 17:24 UTC
In reply to:

JimKasson: "The triple-camera array on the rear of the device features a 6x focal length range, and includes a 77mm (equiv.) focal length (3x) telephoto module, a 13mm (equiv.) F1.8 ultrawide module and a 26mm (equiv) F1.5 wide module."

The focal lengths are FF equivalents, but the f-stops aren't, right? Isn't that stacking the deck?

Stacking the deck? What does that mean? It’s the normal way to describe lenses in any format. The f-stops are equivalent in light-gathering ability. So yes, they are the same, format size to format size.

As far as depth of field goes, it’s more complicated, but there’s still some equivalence. 13mm on full frame has a very deep plane of focus, and it does on a smartphone as well; somewhat deeper, in fact.
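The depth-of-field point can be sketched with hyperfocal distance. This is a rough illustration, not anything from the article: the ~7.2x crop factor and the 0.030 mm full-frame circle of confusion are my assumed values for a typical phone sensor.

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance in mm: focused here, everything from
    half this distance to infinity is acceptably sharp."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

CROP = 7.2  # assumed crop factor for a typical smartphone main sensor

# A real 13mm f/1.8 lens on full frame (CoC ~0.030 mm)...
ff = hyperfocal_mm(13, 1.8, 0.030)

# ...vs the phone lens framing the same view: the actual focal length
# and the acceptable circle of confusion both shrink by the crop factor.
phone = hyperfocal_mm(13 / CROP, 1.8, 0.030 / CROP)

print(f"full frame: {ff / 1000:.1f} m, phone: {phone / 1000:.2f} m")
```

With these assumed numbers, the full-frame hyperfocal distance comes out around 3.1 m and the phone’s around 0.44 m, so both are deep and the smartphone’s is deeper still.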

Link | Posted on Sep 17, 2021 at 14:30 UTC
In reply to:

D_Gunman: Yeah. It's revolutionized for Apple to make 4k 30fps recording exclusive to the higher storage sku.

Yeah, just try to shoot 30 minutes in the format, and when you see the out of memory tag come up well before that, you’ll grouse about that too.

Link | Posted on Sep 17, 2021 at 14:27 UTC
In reply to:

Quickoo: How ist the macro capability with the wide and Tele lens? I think it’s more interesting then the UW lens. The digital zooming on iPhone 11pro when focusing too close is annoying.
Focusstacking internally would be interesting too.

Only the super wide has macro. That’s not surprising; it’s much easier to do macro with a super wide because of its short minimum focus distance and deep depth of field.

Link | Posted on Sep 17, 2021 at 14:25 UTC
In reply to:

Welsh: Good grief… I have an ancient ‘phone that my kids laugh at. Should I get one of these Apple devices to replace my Canon R5?

That’s not a serious question, is it?

The cameras are for two different purposes. Most pros carry their “normal” camera and a top smartphone, usually an iPhone.

Link | Posted on Sep 17, 2021 at 14:24 UTC
On article Laowa Argus 35mm F0.95 sample gallery (DPReview TV) (38 comments in total)
In reply to:

white shadow: It goes to show manual lenses are not for everyone. Not everybody is experience in using them especially on a camera that expect to use AF lenses shooting at high fps.

For me the lens is a bit too heavy and big for a camera like the R6. Does anyone really need a f/0.95 max aperture ?

No. Not any more. When I was in my late teens, in the late 1960s, we shot with slow film. Tri-X in Acufine at 1200 ASA was about the fastest we could use and get a decent image, if you could accept the grain. High Speed Ektachrome was all of ASA 160 for tungsten, and 320 for daylight.

That’s why those f/1.4 lenses were developed, and somewhat later, f/1.2. But today, when ISO 12,800 can give beautiful, noiseless images, f/0.95 has no real use. The small difference in the defined image plane is almost unusable; it’s almost impossible to focus that precisely.
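The gap between those film speeds and a modern sensor works out to several full stops; a quick sketch of the arithmetic (stops are the base-2 log of the sensitivity ratio):

```python
import math

def stops_gained(iso_from, iso_to):
    """Exposure stops gained going from one sensitivity to another."""
    return math.log2(iso_to / iso_from)

# Pushed Tri-X at ASA 1200 vs a clean modern ISO 12,800:
print(f"{stops_gained(1200, 12800):.1f} stops")  # about 3.4 stops
```

Roughly three and a half stops of sensitivity, far more than the half stop or so an f/0.95 lens buys over the fast primes of that era.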


Link | Posted on Sep 13, 2021 at 18:15 UTC
On article Laowa Argus 35mm F0.95 sample gallery (DPReview TV) (38 comments in total)

I love it: “Lots of character.”

That’s the way something with a lot of visual problems gets described, and this lens does indeed seem to have a lot of problems. For the extra half stop, those problems have to be considered the price paid. The same can be said of the low price.

But there are far better and much more convenient lenses out there. More pricey, and losing that half stop, which realistically no longer means that much. But much easier to focus, which, after all, is everything when shooting wide open with a fast lens. Much sharper, too, with much less flare and better contrast.

I don’t see anything “lovely” about these images. If you do, just find some old lens that still works on your camera.

Link | Posted on Sep 13, 2021 at 18:09 UTC as 4th comment

I’m curious about how much is real detail, somehow extracted, and how much is “detail” that the process invents because it’s “supposed” to be there.

For the photo of the man, how does it know, from that blur, that there is that reflective highlight on the eye? I don’t see any indication of it in the original photo. Additionally, you’ll notice that in all of the corrected images until the final one, the reflection moves around and changes shape until the software believes it’s in the correct place, following the assumed shape of the eye vertically. That’s likely correct, if there is a reflection. But I’ve taken photos with lights that had differently shaped reflections.

And then, were these blurred photos made from photos that were originally sharp? If so, there could be information left over from the sharp originals to guide the software. I’d like to see it done from photos we KNOW were soft or blurred originally.

Link | Posted on Sep 2, 2021 at 15:57 UTC as 5th comment | 1 reply
In reply to:

sneakyracer: They look like garbage at very low light levels lol

And it was the best you could get back then.

Link | Posted on Sep 2, 2021 at 15:30 UTC
In reply to:

Lan: "Anqing Lui says these three images were shot with LED lights that have matching CRI ratings, but which all produce quite different skin tones. This, he says, demonstrates the CRI measure isn't useful anymore"

Respectfully, I disagree. CRI is a very useful measure of the uniformity of the spectrum of light. The examples clearly show lights with different colour temperatures; you can get high CRI lights in a variety of different kelvin values - as shown in the example here. You just need to choose a high CRI light with the right colour temperature/tint for your application, or set your in camera WB correctly. CRI is profoundly useful for determining whether you'll get a nice spectra (and hence good colours) - cheap LED and CF lights are prone to very notchy spectra. My inner cynic suggests this light has a poor CRI, which is why they're slagging it off as a measure of quality...

CRI isn’t a reliable indicator of lighting accuracy. The problem is that it averages only a handful of test color samples. These lamps are cheap, and they are not continuous spectrum. That’s why different lamps with the same index can look different.

I have print boxes that use continuous spectrum lamps. Those do have an excellent color rendering ability. But the lamps are expensive.

Link | Posted on Aug 27, 2021 at 22:41 UTC

As someone who ran, with a partner, a large photo lab in NYC for over 20 years, I’ll say: don’t pledge anything here. Wait until this machine comes out, if it ever does, as a full-fledged product and has some sales time under its belt.

Even our equipment, manufactured by some of the largest manufacturers in the business, had some problems. This will have teething problems.

I don’t want to discourage them, but as consumers, people have to be more careful of their own position. A small company that buys a new complex piece of equipment from a new manufacturer, who may not have enough money to properly follow through, can easily get burned.

Link | Posted on Aug 23, 2021 at 15:17 UTC as 28th comment | 15 replies