During the launch of Apple's new hardware and software yesterday at the Worldwide Developers Conference (WWDC) 2019, an unreleased 8K Canon video camera was used to capture high-resolution footage to demonstrate the quality of the new Pro Display XDR. The camera was mounted on a robotic arm and was feeding the Apple display with 4:4:4 ProRes 8K raw video via an Atomos Shogun, according to tech YouTuber Jonathan Morrison, who live-streamed from the event.
In his video, the Apple rep refers to the camera on the robotic arm as an 'unreleased 8K Canon camera', and from the clips it is easy to see that it is designed in the style of the company's C series. Interestingly, it appears to be mounted with a Sigma 18-35mm T2 in the EF mount. The relevant portion is at roughly the 10:55 mark in the video below:
Canon has featured 8K demonstration cameras and displays in the past at its Canon Expo events, but this is the first time it has allowed the technology to be shown outside of the ‘showcase’ environment, and in a body form that we would recognize.
1. These Apple vs. Windows arguments in every comment thread are getting old. I have worked a lot on both Mac and Windows machines, and my experience is that my Windows machines are much more reliable. That is my personal experience and opinion. If you like Mac better, that is fine. To each their own. Trying to convert people from one "religion" to another with witty and demeaning internet comments will not work. :)
2. The monitors look sleek and modern. The Mac Pro, not so much. Is that just me?
Those chrome legs and handles against the aluminium grey cabinet are an eyesore. The cheese-grater front is also not that visually pleasing. If the case itself were matte black, the chrome would probably look sleeker, but I'm not sure that would be enough.
That is an excellent point. Matte black would have transformed the cheese-grater pattern into something futuristic and, importantly, wouldn't directly visually reference (for those with visual memory) something used to grate cheese. A silver rectangle with a pattern exactly replicating the holes of a kitchen grater is a visual mistake, and something that cannot be unseen once pointed out.
Jony Ive, for all his talent, cannot curate the wider implications of his visual inventions, which is his major weakness; and Apple's visually absent, visually clueless curation of important design decisions isn't worth a damn.
I think that there are some further moves here from Apple to keep NVIDIA out. This isn't that well reported, but there's a feud between Apple and NVIDIA which neither has really commented on in public. I think there have been several developments in the announcements from Apple this week that reinforce their position:
1. The new Mac Pro only comes with AMD GPUs. This is as expected, as Apple stopped using NVIDIA GPUs around 2012 (the last two supported NVIDIA GPUs were the GTX 680 and Quadro K5000).
2. Apple Afterburner appears to be Apple's equivalent to NVIDIA CUDA cores.
3. The enhanced Gatekeeper in macOS Catalina will only allow Apple-checked apps to run. Since Mojave, Apple have stopped approving NVIDIA's drivers for their latest GPUs, and this is another potential way to block NVIDIA's drivers.
... None of this will be of concern to the film studios that the new Mac Pro is aimed at. It's a story that I've followed, as I still have a 2008 Mac Pro running a patched High Sierra OS with an NVIDIA GTX 680 - it's still a solid workstation, to Apple's credit. Apple have control of their ecosystem, which has pros and cons; however, they have let down the users of the previous cheese-grater Mac Pro by blocking NVIDIA's latest drivers, which I think is petty.
At 8K, you're going to be more concerned with viewing angle than screen size; in other words, a headset or a theater makes more sense. Realistically though, 8K is needed to make good 4K output. In the end, people are going to want to record at 2-4x the resolution they intend to produce.
@Foveonite. At 100%, the text on a 32-inch 4K monitor is so small you almost need a magnifying glass to read it. At 150% it is normal and perfectly smooth, so more pixel density is not needed unless you choose to live with your face 6 inches from the monitor. At 42 inches, a 4K monitor can be sensibly viewed at 100%, but the text is a little jagged (as it always is at 100%), so more pixels would be useful at that size. 6K would be plenty, so 8K is still overkill.
I said "almost". It depends on your vision, of course, but with 20/20 vision text is uncomfortably small at 100%, in Win 10 at least. Don't know what a Mac does, but given the "helpful" nature of the Mac OS, it may adjust the text size automatically.
I measured the text on my screen for you. It's the same size as the text on the back of a credit card. I look at my monitor from approximately 18-24". Even without my glasses I can read it though my prescription is not that large.
"Can read" and "comfortable to read" are two very different things. If somebody handed you a copy of "War and Peace" with that size text, you would be complaining of eyestrain by the time you finished reading it (if you got there). The Windows default for a 32" 8k monitor is 150% and that is comfortable. 125% is usable but tiring, and 100% will wear you out if you spend a lot of time at the computer. My complaint with scaling is that images are also scaled, so unless you are using a program like LR or Photoshop (which doesn't follow the Windows rules), all your images will be scaled up and you don't get the benefit of the high-res screen.
@Foveonite. At 150%, the text is very sharp. A 32 in 8K monitor would have to be scaled to 300% to have the same text size, and that is simply a waste of pixels. At 43 in, you can make a case for 5K or 6K, but 8K is still overkill for almost all uses. That is not to say 8K wouldn't look nice, but remember, you will need a 33 MP 16:9 image just to fill the screen (without scaling).
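A quick back-of-the-envelope sketch supports the scaling arithmetic in this exchange. This is an illustrative Python calculation (assuming 16:9 panels and the usual definition of pixels per inch), not anything from the thread itself:

```python
import math

def ppi(diagonal_in, horiz_px, vert_px):
    """Pixel density (pixels per inch) from panel diagonal and resolution."""
    return math.hypot(horiz_px, vert_px) / diagonal_in

print(round(ppi(32, 3840, 2160)))  # 32" 4K: ~138 PPI
print(round(ppi(32, 7680, 4320)))  # 32" 8K: ~275 PPI

# Effective working resolution after Windows-style scaling:
# 4K at 150% and 8K at 300% both behave like a 2560x1440 desktop.
print(3840 / 1.5, 7680 / 3)
```

In other words, an 8K 32" panel at 300% scaling gives exactly the same on-screen text size as a 4K 32" panel at 150%, which is the commenter's "waste of pixels" point.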
Perhaps this is not so much an unreleased camera but a custom built model for Apple. Apple has the money to get what they want and it's probably not unrealistic for them to have a completely custom built device.
I can believe that Canon has a new 8K video camera coming out relatively soon. We know that Canon aims to have 8K equipment ready for the 2020 Olympics.
I can also believe that Apple has the clout to get Canon to allow them to use an unreleased model in their demos. In fact, it's smart for Canon to do so... look how it has people talking. And by showing it off at an Apple event, it will probably get more pre-release attention than at most venues.
Apple did not commission a custom camera from Canon; they have had an 8K prototype that was first seen back in 2015 and has been further developed over the years.
Canon has talked about 8K for a while. Several years ago here in NYC, the Canon Expo, which comes every two years or so, had video from an 8K video camera. They used their own 4K monitors to show it. They zoomed in on the video until a subject we couldn't even see in the wide shot was taking up much of the frame.
8K video doesn't directly imply the framerate or bitrate. It could be closer to streaming out a motion-JPEG format from the sensor. As long as it's 7680 pixels wide and sends 15 frames per second, they could get away with calling it 8K.
I keep seeing people commenting that you can build a PC to match this computer but I think that this misses the point of who this computer is intended for.
Reality is, most creatives needing a machine like this are not computer engineers or even computer enthusiasts. They are photographers, graphic designers, 3D animators, videographers, and so on. I am a couple of these and after reading the components list of what is offered it came off as another language to me.
Apple buyers don't want flexibility, they want usability through efficiency. They want the most fully integrated user experience available, they want no hassle buying. They want to create and not be bogged down by complex tools that more often than not distract from the user experience that too many options create.
Also, I doubt you can duplicate this computer and monitor plus the user experience on any DIY PC for the same price.
Well, then how do you decide what configuration you're going to buy? Because the lowest one proposed by Apple is some bad joke, and if you don't care about what SSD, CPU and other acronyms mean, you end up with an inflexible, super expensive mix of low-end and high-end components, which will definitely bring some frustrations down the road.
Well, I imagine very few individuals are the customer base. These are meant for studios and there will be someone there who has the knowledge regarding the tech but the actual creatives won't care so much.
I also think you're overestimating the amount of knowledge one must have to pick out a computer to fit their needs.
A more relatable analogy would be someone buying a Sony A9 for wildlife photography based on the stated specs and reviews. This person doesn't need to know who makes the components or the model numbers of the sensor, etc. We trust that Sony has put together a camera that will do what it says it will do.
That's the selling point of Apple and Dell and HP.
Apple products just work and work well. I remember getting rid of Windows years ago. Goodbye to virus attacks, antivirus subscriptions, system crashes, etc.
My macs have been a trouble free joy to use. Best decision I ever made with regards to technology.
@thenoilif "I keep seeing people commenting that you can build a PC to match this computer but I think that this misses the point of who this computer is intended for." Complaints about the lack of configurability of the Mac Trash Can Pro are exactly why this one is upgradeable. And whilst the average creative isn't upgrading their computer no matter the system, Apple have to address several user tiers with one model. You are creating a more uniform picture of "Apple creatives" than truly exists. They range from those who pray in the direction of Cupertino or have a shrine to St. Jobs, to performance junkies. Actually, the user you describe will most often opt for the iMac instead of the Pro. There is no reason to venture towards the Pro if performance, and an understanding of it, isn't the point.
My point was simply that if one buys into a configurable anything, then one has to know what one is buying, or ask for help from someone who knows. (The people in studios/companies who use computers bought and configured by others are not that case: they are users of the system, yes, but basically have no say in the acquisition. If the boss decides to switch to Android video editing, they either accept it or change jobs.) And that includes Apple products as well as anything else.
@thenoilif "Well, I imagine very few individuals are the customer base. These are meant for studios and there will be someone there who has the knowledge regarding the tech but the actual creatives won't care so much." Whilst there are studios that have separate IT and creative departments, there are many where they cannot afford a complete separation of functions but still need high-performance machines. And my experience with larger graphics-oriented companies is that IT often doesn't know what creatives need, and the creatives have to educate IT.
HP has a similarly configured model that's $8,000, and Dell has one for $9,000. It's interesting, but for years Apple's top machines have actually come in lower than comparable machines from others.
But Apple has some unique features, such as the accelerator board for transcoding and other high-end video use that processes 6.3 billion pixels per second, which no one else has right now. That's 3 streams of 8K video, or 12 streams of 4K.
@melgross I'd have to see that to believe it. Specs can be deceiving if one doesn't understand them. And Apple have consistently been higher priced when comparing like for like. Performance is the ultimate comparator, and that is a little less straightforward. That said, the ultimate performance will elude Apple as long as they are wed to AMD graphics, one of the more common complaints from power users.
@Thenoilif: Interesting analogy between Apple “Creatives” and Sony “Creatives”. I could construe your comment to mean that neither “Creative” can be bothered to understand their product beyond the most basic of levels. Which is fine, but also means their opinion is about as informed as an earthworm. People can call themselves “Creatives” as much as they want, but they should not be surprised if they occasionally see people making hand gestures akin to shaking dice behind their backs.
Please don’t put words in my mouth. I was very clear on what I meant.
To clarify even more: if I were a pro wildlife photographer, I would look at the performance specs of the camera to determine if it was the right investment for me. I would see 20 FPS with no blackout as an appealing function in a camera. I wouldn't need to be bothered with the mechanics of how it accomplishes this, as long as it does what it says it will do. If I buy it and it doesn't do that, then I return it. If I buy it and at some point this functionality stops working, then I return it and get it repaired. I don't need to know how it works because I am not concerned with fixing it myself.
This mentality is shared by people who don’t want the overly complex nature of modern technology getting in the way of their creativity. If DPR is any gauge, you can see how technology can interfere with the creative process as people obsess more about the machine than the actual input/output.
After 1½ days of thinking about the new Mac Pro (having owned 3 predecessors, including the water-cooled G5), I have to say that I am slowly starting to understand the niche it fits in.
Massive computational power to get today's work done that no other computer can do at this point in time.
Looking 2-3 years down the road, it is obvious that technological progress will enable completely different tools that are much more affordable, and Apple will be one of the first to implement them.
I guess it is a safe bet that we'll see an 8K 32" iMac Pro at some point, with similar computational power and the latest in GPU technology.
The "normal" high-end user will be very happy with his current iMac Pro at a fraction of the cost, with a bit more time needed to complete computationally intensive work.
I am still a bit disappointed by the lack of inspiring new design. When I first heard the idea of a new modular Mac Pro, other form factors came to mind.
The future is exactly the point of this machine. It really is not about serving graphic designers as much as high-intensity tasks like 3D and video. Given OS X's past in NeXTSTEP, it is also easy to see immersive environments and data mining as targets.
Looks like you need to see the design up close. I have seen some pictures that make it look like those holes are actually hollow spheres, not just some cheap mesh. And I also think this is an interesting look into the future.
At $5,999 for the base model with a 256 GB SSD and an upper limit of 4 TB of storage (according to the Apple site), I think that might not be the best idea…
I tried Windows last year after 18 years of using MacOS.
At first I thought this Windows machine would work great for me. However, the first problems occurred just 3 months after I bought it; then, within 6 months of purchase, the whole Windows machine fell apart from some stupid Windows update. I lost 2 working days getting it restored.
You guessed it - Sold it again.
Happy to buy a new Mac again. Live your dream that Windows is better... it's just not my experience.
MacOS is better than Windows in the same way that, for the average Joe, a smartphone is better than a real camera: because he doesn't know how to properly handle a real camera.
I've used Windows all my life and was forced to use an iMac for 2 years while working as a graphic designer for a company; I couldn't stand how painfully limited and uncustomizable macOS is compared to Windows. My uncle and sister use iMacs for gaming, and the damn things overheat and throttle all the time. My PC, on the other hand, is rock solid: never overheated, never lost data, and when I want to upgrade something I just swap out the part instead of being forced to buy a whole new computer.
Grapejam - Creative professionals need tools to get their creative products out. They don't need to tweak their computer, and they don't want to spend time on settings that are useless to them.
As a creative I want to have fun and want to play: open up Capture One and Photoshop, or work on an Illustrator or InDesign project.
I am not bothered with the technology that sits inside the computer. I am not interested in the System Preferences panel and what I can tweak.
It just needs to work, and that is what a Mac does! It immediately gets me up to speed to start working on my creative projects. No pain in maintenance.
If you see fun in tweaking the system and constantly updating to keep your computer running, then I think Windows is probably the best choice.
For those who just want to work without hassle and without fussing around with the system to avoid blue screens and crashes, a Mac is better.
"Creative professionals need tools to get their creative product out. They don't need to tweak their computer and want to spend time on making settings that are useless to them." Keep telling yourself that.
Funny, because aside from the initial setup, I never have to constantly tweak my Windows settings. Mind telling me what you have to constantly tweak in Windows? I've used Windows far longer than you and can't remember having to. When I used macOS, I actually had to fiddle around with it whenever I wanted to customize something, far more than I have to with Windows.
Just buy a good antivirus software like ESET and be done with it.
And I've never had a problem with Windows updates; just set Windows Update to install at shutdown after you've done all your work and are going to sleep.
Microsoft has deleted the “customizability” that made Windows more usable. Where is the color-scheme editor that allowed you to set up a system-wide color scheme from 1991-200x? We didn’t have to wait 30 years for a vendor to dribble out a hard-coded “dark mode” on Windows... but now we do?
The problem with Windows is Windows Update. After I installed W10, I spent an hour removing all that junk (Metro apps and more). After one of the updates, all that junk was back, so I had to do everything once more. And every now and then Internet Explorer makes a return. I really don't like that Windows does shi* behind my back.
Ignorance of how a system works is no critique of the system. Macs are so much more powerful once you learn how. There is nothing a PC can do that a Mac can't. But it's not so easy vice versa.
@ewelch - you started your comment well: "Ignorance of how a system works is no critique of the system" but continued with exactly the opposite :))
The simple truth is that all things can be done on both systems. And it's fine to prefer one or the other. But "Macs are so much more powerful once you learn how" and "[...] not so easy vice versa" is exactly the ignorance you talk about.
Once again we simply get a flamer post that does nothing to advance any debate, and I'm talking about the original post by GJ which is simply juvenile.
Why does it always have to be one extreme or the other? Why is it either one brand totally is awful while the other is flawless?
I don't like Windows myself, but I realize that some do and that it serves their purposes. Why not accept that Macs also have value to many users and leave it at that?
It's these types of polarizing debate comments that prevent real discussion, learning and progress. It's a sign of lack of maturity. Real life is complex and we need to get comfortable with that.
Opinions are like cell phones—everyone has one ;-) Here's what award-winning Director of Photography Ben Allan ACS CSI has to say: "For anyone working with massively complex projects or very high resolution this machine makes things possible that simply weren’t possible or at least practical before." https://www.newsshooter.com/2019/06/05/hands-on-with-the-28-core-mac-pro/
He got to play with four different Mac Pro/XDR systems running Da Vinci Resolve 16, FCPX, Pro Tools (6 cards installed) and Logic Pro. Read the News-Shooter post—very interesting and informative.
The new Mac Pro is an incredible machine and really a bargain for what it offers.
Bravo to Apple!
Also interesting to see the new 8K Canon in a demo at the event...Apple has enough clout to merit Canon loaning one of its unreleased units to Apple.
Also, Nikon Rumors has an interesting speculation: during the Keynote, they showed 8K footage for a documentary by Ami Vitale, who is a Nikon Ambassador. It's likely that anything done by her would be filmed on a Nikon.
So this Apple event may have inadvertently hinted at new 8K machines from Canon and Nikon!
That comment makes no sense. How did Apple fail to give credit to Canon or Atomos? This is a demo setup in a demo room at WWDC, not the Keynote presentation.
They clearly allow people to see the equipment used; they are not hiding it. In fact the point of the demo is to show the equipment in a real world type scenario working with other equipment.
You guys believe he was using the $5,999 spec-sheet Mac Pro? The only thing the machine he was using and the $5,999 one share, performance-wise, is the Mac badge.
If it's the transportation that is required to get you to work each day, be it white or blue collar, it's pretty much making you every cent you manage to bring home during that job.
If it's the transportation you're using during your Lyft gig, it's making you every cent that you manage per mile.
I'd say the panel is made by Korean Samsung or LG. The monitor itself is made in China by Foxconn or other Chinese contractors. It is overpriced, but not that much compared to pro NEC or Eizo stuff.
It is so much more than anything NEC or Eizo does for anywhere near the price. Reports are that it is clearly even better than the $43,000 monitor Apple mentions, because that one can only maintain 1,000 nits for six seconds, while the Apple display can run at 1,000 nits continuously and go to 1,600 temporarily.
Hahaha... that Dell monitor doesn't even come close to matching the quality of the Apple one. For instance, the Dell has a 1,300:1 contrast ratio; the Apple has a 1,000,000:1 contrast ratio.
Also, I bet that Dell one won't last nearly as long as the Apple one.
The Mac Pro comes with Intel processors 2 generations old and no AMD Threadripper CPU option, despite Apple's AMD GPU addiction... Threadripper is an exciting option unavailable to Apple users.
I also don't understand, and will never understand, how a machine that claims to be pro and will cost between $6,000 and $45,000 requires users to add a module to get an audio jack.
Gosh, maMamiya, the majority of photographers and video content producers use Macs. I bet half of the attendees are among them. They sat there watching the videos, as did the rest at home. Nobody shrieked "EUREKA!!! That is 8K!!!" until DPR published that it was an unreleased Canon video camera.
Nikon should have done the same on their Nikon DL Advanced Compact Camera Unreleased Version.
That goes to show nobody can really tell 4K from 8K video until told so. That is life. That is psychology at work.
Mariano brings up a good point. Apple is not going to trust showing off their Mac Pro to any video rig unless it is absolutely reliable, impressive, and ultimately trustworthy itself.
Therefore that new Canon must be, one, a really great camera, and two, far enough along in development to be stable enough to show at the event.
None of that makes any sense. There was no “trusting” of anything to anything. A crew with a Canon recorded 8K footage to Atomos recorders well ahead of time. Then Apple played the file back.
Translation: Apple did essentially none of the work, and punked out on giving credit to those who did.
Many cameras are tested "in the wild" before their existence is officially announced. What's so strange about Apple using an unreleased Canon camera before DPR knows about it?
Not sure how they actually captured it. Only one cable comes from the camera. If it is an SDI cable, you need 4 streams to capture 8K. Or was it an HDMI cable? Then you'd need HDMI 2.1; the Shoguns only have HDMI 2.0.
A single 12G SDI cable is not fast enough for 8K. You need bandwidth of 24G for 8K24 at 4:4:4 or at 4:2:2 it would be 18G. Way more than the current 12G SDI.
If they shoot RAW, they only need one SDI cable. Someone on here just explained that to me: "While it is 12-bit, it is only one value per pixel instead of three. So in uncompressed form, RAW is actually smaller than a demosaiced picture, allowing for those data rates."
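The arithmetic behind this exchange can be sketched quickly. The following Python calculation is illustrative only (uncompressed payload, ignoring blanking and audio overhead, with assumed bit depths), not a measurement of the actual Canon/Atomos rig; the commenters' 18G and 24G figures presumably include interface overhead or higher bit depths:

```python
# Rough uncompressed payloads for 8K at 24 fps.
W, H, FPS = 7680, 4320, 24

def rate_gbps(bits_per_pixel):
    """Payload data rate in gigabits per second."""
    return W * H * FPS * bits_per_pixel / 1e9

print(round(rate_gbps(3 * 10), 1))  # 4:4:4 10-bit, 3 samples/pixel: ~23.9 Gb/s
print(round(rate_gbps(2 * 10), 1))  # 4:2:2 10-bit, 2 samples/pixel: ~15.9 Gb/s
print(round(rate_gbps(12), 1))      # 12-bit Bayer RAW, 1 value/pixel: ~9.6 Gb/s
```

This is why sensor RAW (one 12-bit value per pixel) can fit under a single 12G SDI link's capacity, while demosaiced 4:4:4 at three samples per pixel cannot.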
Sony user: Canon sucks, nothing but dated sensors and recycled bodies, behind on technology.
Sony user: That being said, I do adapt Canon lenses. Also, I miss their ergonomics, menu system, swivel touch screens, WiFi/Bluetooth connectivity, GPS, flash system, color science and service - but outside of that they are just so behind.
Canon user: Just smiles.
Also, people on this site don't know the awesomeness that is Canon Cinema. Some of the easiest cameras to use, with really solid IQ. People still shoot on C100s even though they're outdated spec-wise.
Goodness! 8K! I'll wait until it's out on the market. I hope it won't be that expensive, considering product-placement costs. Apple takes money seriously.
@SteveAnderson - Good point. The same can be said for being able to see 15 stops of dynamic range, or the resolution of a lens that can resolve a 50MP sensor.
At least we found out who is still buying Canon cameras. Joking aside, Canon loves this: while Panasonic is trying to put this in their mirrorless cameras, Canon is dead set on keeping an artificially high-priced lineup of cinema cameras.
“You’re an idiot” actually means “I can’t find a good argument to counter yours, so I’ll just insult you instead”. Often when people don’t share the same view point they turn around and say that the other person is an idiot.
They need to upgrade the current 2013 Mac Pro to 2019 specs and keep the $2,999 base price, since a $5,999 base is way too high for consumers. I had the 2008 Mac Pro cheese grater and prefer the 2013 trash can, since you can use an ultra-fast external Thunderbolt drive for 4K and 8K video. https://eshop.macsales.com/shop/owc-envoy-pro-ex-ve/thunderbolt-3
I don't want a monitor with a computer built in. I use dual 30" 4K monitors with my 6-core Mac Pro. There are many consumers using the current Mac Pro, and I prefer it to a tower case like my old 2008 Mac Pro.
The Mac Mini uses mobile Mac/PC CPUs and graphics chips that are much slower than the CPU and graphics in the Mac Pro. I have an older Mac Mini and am satisfied with the performance of my current Mac Pro. Apple needs something between the Mac Mini and the new Mac Pro, which starts at $6,000 and goes to $40,000 loaded. The current 2013 Mac Pro with 2019 specs at a base price of $3K would do the trick.
No, you are mistaken. The 2013 Mac Pro only has USB 3.1 Gen 1 and Thunderbolt 2, half the speed of Thunderbolt 3. Apple quit making them because the enclosure couldn't handle the heat faster GPUs would generate, and it couldn't be made adequately powerful.
The new Mac Mini can drive a monster GPU via Thunderbolt 3. The new Mac Pro is underpriced according to real pros out there who are the target market for the new machine. Compared to the competition that is.
It's fine that you enjoy your cute computer, but making up silly stories is silly.
By the time that computer can actually do any work it will be spread all over your desk with attachments like graphics cards and HDDs.
The competition is the same as always: much more compatibility, better cooling, infinitely more versatility, and longevity, for half the money.
The only reason Mac is getting better on price than in the old days is because they use Intel and AMD just like the competition. Rebranded monitors, proprietary ports and an Apple badge keep the price falsely inflated.
Again, it's fine if you prefer it, but don't make up stuff that is easy to debunk.
Not one single thing you said about Apple is true. It works better and faster, thanks to the ease of automation, than anything. What proprietary ports?
The 90s called and want your tired arguments back.
You like Apple, good for you. Enjoy. I didn't say they were bad. They do what they do well. They just don't do anything else, and price/performance is still an issue.
They don't do anything else? You really don't know anything about Apple. Ever hear of iTunes? FileMaker Pro? (Database) They're putting a billion into movies and tv for their new service coming this fall. They make phones. They make watches. Holy smokes, you need to get your head out of the 90s.
C'mon man, you are such a fan that your reading comprehension is skewed.
"Mac computers don't do anything else" means:
They don't like to run the other 90% of software. What if you want to use a much better program than iTunes? Dual-boot to Windows.
That includes games and, as you say, the 'real pro' software. (Emulators don't work well, so they don't count; dual-boot means a Windows install, and you can dual-boot macOS on a PC as well.)
And I am not putting them down. Look above your first remark; I never put them down. I said they are good at what they do.
You went too far with your "real pros use a mac mini" talk.
Allegedly 8K is an option on the next C300, which is a long way (C500, C700) from being their high end, so it may be somewhat affordable, in Cinema EOS terms anyway... well, unless you buy the optional MIA robot. Also, apparently, 4:4:4 ProRes 8K RAW recording.
Horshack, all the video experts out there are gobsmacked at the specifications of these monitors. Time will prove whether the numbers are accurate. But the people who have actually used them (professionals Apple let try them beforehand) all say they are without peer for anything anywhere close to the price, even compared to the $43,000 Sony reference monitor, which can only hold its peak light level for 6 seconds (Apple's can go indefinitely at the same level and go up to 600 nits higher for short periods). Chances are they are going to be bought in massive quantities.
@ewelch, The Apple monitor has eye-catching marketing specs but falls short where it matters most, specifically using a multi-segmented LED backlight instead of OLED. That's what gives the monitor its impressive brightness but at the expense of uniform brightness and contrast, which are essential elements on a reference monitor.
Horshack, that is not an OLED monitor. You need to do some research. It's using blue LEDs and converting that blue light to white.
"2D backlighting system using 576 full-array local dimming zones. Apple-designed timing controller (TCON) chip engineered to precisely control high-speed modulation of both 20.4 million LCD pixels and 576 LEDs in the backlight for seamless synchronization. True Tone technology with dual ambient light sensor (ALS) design to ensure an accurate viewing experience in any ambient lighting condition."
Horshack, that is specifically the monitor Apple is comparing theirs to. It can only sustain 1,000 nits for six seconds, and is so expensive it can only be the last monitor in the workflow, whereas Apple's monitors are inexpensive enough that everyone can have a reference monitor, rather than just the last person.
@ewelch, And as I said, the Apple monitor uses multi-segmented LED backlights to achieve that brightness, at the expense of the brightness and contrast uniformity offered by OLED and required by pros in a real reference monitor. The Apple specs are impressive, but they are consumer specs, not suitable for critical reference-monitor work.
Horshack, such claims are dated and no longer apply, if Apple and the pros who have used it know what they're talking about. OLED is not necessarily superior; that's just how it's been done to date. And apparently it's no longer state of the art, unless by that you mean the most expensive. OLED has its own share of issues that can't be ignored.
@ewelch, Claims are dated? How so? I'm not talking about OLED in general hype terms but specifically about how the technology makes it the current preferred choice for reference monitor usage. If you put an OLED next to a multi-segmented LED backlit monitor you'll see an obvious difference in uniformity. There's no way to escape that: the segments of the LED backlight, no matter how numerous, are coarse illuminators for the pixels they are designed to light.
Horshack, yes, you are talking about OLED in general. Have you even bothered to read what Apple has said about how they have designed their monitor? Because it's not your typical LED backlit monitor. There are blue LEDs, each individually controlled and passed through a new kind of filter to make the light perfectly neutral in color and evenly distributed at such high levels. These are not your granny's LEDs.
@ewelch, I've read it. It's still a multi-segmented LED backlight monitor. It doesn't matter how many segments it has - it's still far coarser than having each pixel self-illuminated as in OLED. All those extra segments do is reduce the coarseness. Again, consumer technology vs professional.
@ewelch, My evidence is basic logic. A multi-segmented LCD display illuminates and blocks groups of pixels, the number of which is based on how many backlight segments the display has. The more segments, the more selective the illumination of pixels can be, which improves uniformity and local contrast. Compare that to OLED, where each pixel is self-illuminating, which means it has perfect selectivity. A multi-segment display can never match OLED because no matter how many segments its backlight has it will never come close to OLED's individual pixel selectivity, let alone match it.
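To put the coarseness argument in concrete numbers, here is a back-of-the-envelope calculation using only the figures from the spec sheet quoted earlier in this thread (20.4 million pixels, 576 dimming zones); it is a sketch of the reasoning, not a measurement of the display:

```python
# Rough arithmetic on backlight granularity, using Apple's published figures.
lcd_pixels = 20_400_000   # 20.4 million LCD pixels (from the quoted spec)
dimming_zones = 576       # full array local dimming zones (from the quoted spec)

# Every pixel in a zone shares one backlight level.
pixels_per_zone = lcd_pixels / dimming_zones
print(f"LCD pixels sharing each dimming zone: {pixels_per_zone:,.0f}")

# An OLED panel self-illuminates every pixel: effectively one "zone" per pixel.
print(f"OLED is {pixels_per_zone:,.0f}x finer-grained than this backlight")
```

Roughly 35,000 pixels share each zone, which is the "coarse illuminator" point in a single number; whether that coarseness is visible in practice is exactly what the two commenters are disputing.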
Your opening sentence in that comment discredits what you say. Logic can only go so far without hard evidence to back it up. You have not even touched Apple's monitor, so you can't objectively say anything about it. Your own statement discredits itself.
@horshack - his observation is correct. You seem more concerned with hating on Apple for the sake of it than with acknowledging that Apple may have obliterated the reference monitor market, if this does indeed work as advertised.
I think the desktop is about to make a bit of a comeback. I just put another 32GB of RAM (64GB in total) in my Linux box to make processing drone footage easier, and suddenly 1.5TB seems almost reasonable. Managing this stuff on a laptop or in the cloud has, at least for me, always been a little stupid; now it's very stupid.
Working in the cloud is a joke for this kind of work, yet many companies are pushing for it. At least for the next 2-3 years I don't see cloud-based systems replacing local desktop systems for heavy workloads.
First, I use MATLAB for everything from echosoundings to video footage. I do a lot of work interchanging time and space, and although with careful memory management I can get away with less than 1 GB of RAM, it's much easier with more: the more time and space I can load into RAM at once, the broader the spectrum of time and space I can analyze. Your average photographer could learn how a computer works and use less than 1 GB. Today I am looking at drone footage a private company collected for me (1.2 TB of 30 MB GeoTIFFs) of a frozen lake. I want to be able to look at all scales of variance, from the whole lake (several-km trends in ice color) down to cm-scale holes. Basically, I want to browse the whole 1.2 TB composite with the ease of viewing a single 30 MB image.
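One common way to get that "browse terabytes like one image" experience is to precompute a multi-resolution pyramid of overviews, so a viewer only ever loads the level matching the current zoom. A minimal sketch of the idea in plain NumPy (the tile size and number of levels here are hypothetical; a real GeoTIFF workflow would stream tiles from disk with a GIS library rather than hold them in memory):

```python
import numpy as np

def build_pyramid(tile: np.ndarray, levels: int) -> list[np.ndarray]:
    """Return [full-res, 1/2, 1/4, ...] overviews via 2x2 block averaging.

    Sketch only: each level halves both dimensions, so level k covers the
    same ground area with 4**k fewer pixels.
    """
    pyramid = [tile]
    for _ in range(levels):
        prev = pyramid[-1]
        h, w = prev.shape
        # Crop to even dimensions, then average each 2x2 block of pixels.
        t = prev[: h - h % 2, : w - w % 2]
        pyramid.append(
            t.reshape(t.shape[0] // 2, 2, t.shape[1] // 2, 2).mean(axis=(1, 3))
        )
    return pyramid

# Hypothetical 1024x1024 "ice brightness" tile standing in for one GeoTIFF.
rng = np.random.default_rng(0)
tile = rng.normal(0.5, 0.05, (1024, 1024))
overviews = build_pyramid(tile, levels=3)
print([o.shape for o in overviews])  # full res down to a 128x128 thumbnail
```

Applied per tile and then mosaicked, the coarsest level gives the whole-lake view of km-scale color trends, while zooming in swaps progressively finer levels back in, down to the native cm-scale resolution.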
If you go to Google Scholar and search for "Tedford Holmboe", the first paper in the list has a good example of what I do in the lab: specifically, in Figure 7 I separate rightward- and leftward-propagating internal waves using a time-space 2D Fourier filter (some details on the imaging are in the third to fifth paragraphs of section 3). Lately I've been working on a pit lake; that's where the drone footage comes from. There are a couple of papers in the Google Scholar list for my name, but nothing yet with the drone footage.
Or buy a $100 VESA stand and attach the Apple monitor to it. Not as flexible for moving it around and finding the perfect position and angle, but you've saved $900 on a monitor that beats $50,000 monitors. Yeah, that makes sense.
@jay jay02, Super35, NOT full frame, is the Hollywood standard. Like most Canon digital cine cameras, this 8K model appears to be a crop camera that uses the full Super35 sensor.