4K from camera to tablet

Nurz
Hello, I'm considering using a Windows tablet for 4K monitoring and recording at the same time. Specifically, I'd like to record 10-bit 4:4:4 video on the fly with the encoder of my choice, rather than being limited to those provided by expensive dedicated recorder boxes. Since tablets so far do not have HDMI input, in theory the camera's HDMI-out could be fed to the tablet through an HDMI-to-USB3 adapter. Has anyone tried this workflow? Does the adapter affect the speed of the signal somehow? I assume tablets equipped with a fast SSD now meet the requirements for this task. What do you think?
 
Wow, you are funny!
 
Actually, a Microsoft Surface Pro 3 fully loaded would probably be capable.

In fact, it is. I just loaded some GH4 4K video onto my Surface Pro 3 (fully loaded) and it played flawlessly.

The Surface Pro 3 has a full-size USB 3 port and runs a full version of Windows 8.1, so I would think that if you could get a computer to act as a monitor, it would also work on the Surface Pro 3. No idea what hardware/software would be required, though.

--
The greatest of mankind's criminals are those who delude themselves into thinking they have done 'the right thing.'
- Rayna Butler
 
http://business.panasonic.com/toughpad/us/4k-tablet.html

I would get a 4K laptop instead IMHO.

But does it have an HDMI-in port? That's the only way to monitor/record the GH4's 4K 10-bit output, AFAIK. Many tablets have HDMI-out, but I can't say I've seen one with an input. Maybe with an HDMI-to-USB3 converter, if such a thing exists?

Pete
 
A very late reply, but anyway...

I appreciate your comments (Francis, thanks, I enjoy myself as well)! After another discussion I had about this, it seems it may actually work if you compromise to 4:2:2. People claim the tablet would need four physical i7 cores, though; a hyper-threaded dual-core system won't suffice. It also seems that USB 3 is only marginally capable of handling the data rate.
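
For a rough sense of why USB 3 is considered borderline here, a back-of-the-envelope check in Python. The resolution, frame rate, bit depth, and usable USB 3.0 throughput below are my own assumptions, not figures from this thread:

# Rough data-rate check (assumptions: UHD 3840x2160, 30 fps, 10-bit 4:2:2,
# and ~3.2 Gbit/s of usable USB 3.0 throughput after protocol overhead).
width, height, fps = 3840, 2160, 30
bits_per_pixel = 20                     # 10-bit luma + half-rate 10-bit chroma
usable_usb3_gbps = 3.2

video_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"Uncompressed UHD 4:2:2 10-bit: {video_gbps:.2f} Gbit/s")   # ~4.98
print(f"Usable USB 3.0 budget:         ~{usable_usb3_gbps} Gbit/s")
print("Fits within USB 3.0:", video_gbps < usable_usb3_gbps)       # False

Depending on which overhead figure you assume, the raw stream lands right at, or slightly beyond, what USB 3.0 can sustain, which is consistent with calling it marginal.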

Now things get a bit more complicated, because there is actually one Windows tablet with HDMI-in, the Wacom Cintiq Companion 2: http://www.wacom.com/en-us/products/pen-displays/cintiq-companion-2. The tricky part is where the website says "HDMI input when attached to PC". I emailed Wacom asking about DSLR compatibility, and after a month they got back to me: "Technically speaking it should work, we didn't test this as a thing, but I don't see a direct reason why it shouldn't." I don't know what to make of their response, but maybe there's still a decent chance of connecting the camera.

All in all, I don't think it's an impossible idea.
 
HDMI carrying 4K video (nominally 18 Gb/sec) demands much higher bandwidth than USB 3, and no simple converter/adapter exists, or will exist, unless the content is first compressed. The Atomos Shogun, for example, captures the 4K HDMI output of cameras like my Sony A7S or GH4, compresses it, and then stores about 400 Gb per hour (roughly 100 Mbits/sec) on an SSD. The video has been shrunk by nearly a 200:1 ratio before storage, much like the typical 100 Mbit/sec rate used for internal 4K camcorder recording to SDXC cards. That Atomos recorder is an $1800 device, far beyond the notion of a USB-class cable/adapter. Going to 4:4:4 would demand even more bandwidth.
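
To put rough numbers on the compression ratio and storage figures above, a quick sketch that uses only the figures quoted in this post (the 18 Gbit/sec nominal link rate and the 100 Mbit/sec recorded stream):

# Ratio between the raw HDMI link rate and the recorded bitrate.
hdmi_link_gbps = 18.0          # nominal HDMI 2.0 link rate quoted above
recorded_mbps = 100.0          # typical 4K recording bitrate quoted above

ratio = hdmi_link_gbps * 1000 / recorded_mbps
gigabytes_per_hour = recorded_mbps * 3600 / 8 / 1000
print(f"Compression ratio:       ~{ratio:.0f}:1")                        # ~180:1
print(f"Storage at 100 Mbit/sec: ~{gigabytes_per_hour:.0f} GB per hour") # ~45 GB

That works out to roughly 180:1 and about 45 GB (around 360 gigabits) per hour, in the same ballpark as the figures above.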

Processing uncompressed 4K video, or doing I/O at the uncompressed rate, is VASTLY beyond the capacity of any tablet or other PC input. On the other hand, PLAYING 4K content from an already compressed transport-stream file (.mts or .m2ts) or MP4-wrapped file is within the reach of tablets with fast multicore CPUs and GPU assistance, so playback of already compressed content is possible.

Larry
 
I understand. Maybe things will change in the near future, but that's another story. As I see it now, not only can you not record 4K this way, you can't even downscale a 4K signal to 2K on the fly; in fact, that would require even more processing power.

I'm digressing from the original topic but, for the sake of conversation, I'd guess high-end tablets can at least handle the data from 2K cameras, right? Wouldn't that also result in a better-quality recording? True 1080 lines, instead of the 800 or so that DSLRs effectively record internally (unless you're running Magic Lantern on a Canon).
 
Given that a 2K frame stores 2 Mpixels rather than the 8 Mpixels of a 4K frame, you are asking a tablet to handle one quarter of 18 Gb/sec, roughly 5 Gb/sec, which is once again well beyond the sustained processing and I/O rates typical of these devices. The compressed video, once encoded at more typical full-HD bitrates of about 28 Mbits/sec or less (roughly one quarter of the compressed Ultra HD 100 Mbit/sec rate), CAN be handled. So the story for HD is much the same as for 4K if your goal is to bring uncompressed HDMI video directly into a computer or tablet: neither will work unless an encoder first achieves the approximately 200:1 reduction that comes from turning frame-by-frame rasters into so-called GOPs containing only interframe differences.
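
The same arithmetic scaled down to HD, again as a sketch using only the figures quoted above (one quarter of the 18 Gbit/sec rate, compared against a 28 Mbit/sec delivery stream):

# HD carries one quarter of the 4K pixel count, so roughly one quarter of the rate.
hdmi_4k_gbps = 18.0
hd_gbps = hdmi_4k_gbps / 4                  # ~4.5 Gbit/s uncompressed-rate budget
hd_delivery_mbps = 28.0

print(f"Uncompressed-rate budget for HD: ~{hd_gbps:.1f} Gbit/s")
print(f"Ratio to a 28 Mbit/sec stream:   ~{hd_gbps * 1000 / hd_delivery_mbps:.0f}:1")  # ~160:1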

Cameras which allow "live viewing" on remote devices use downsampling hardware to generate the live real-time stream, or a hard-wired HDMI tether to a monitor. They cannot transmit full bandwidth content through comparatively slow USB ports. Thunderbolt 2 equipped devices are beginning to have adequate speed but are not found on tablets, nor are processors capable of handling such high data rate input.

Encoders that allow HD input are sold by companies such as Blackmagic, AJA, and others, if your goal is to capture HD content for viewing or editing.

Google "Blackmagic Intensity Shuttle".
 
Yes, it seems there's no alternative for external capture. I was after freedom and portability: recording on location with the variety of options given by PC capture software, and without the need to convert afterwards. I don't want to start a format war but, to me, codecs such as the lossless MagicYUV, and Grass Valley HQX for lossy compression, are high quality, fast, reliable, and support all kinds of resolutions and aspect ratios. But no external device supports them. We're stuck with ProRes and DNxHD, or huge uncompressed raw files on the $2000+ recorders. Maybe I'm asking too much, but the first thing that came to mind was to look for a tablet that could handle all this data, since it can run whatever capture software I choose. Now I think I'll go for a Sony camera and settle for XAVC S, which is decent but slow to work with as an intermediate.

By the way, thank you for the analytical presentation of the issues.
 
Glad to offer my assistance!

The real format war, which I personally regret, is HEVC versus XAVC. Both are currently used on 4K cameras, and each has its advantages and disadvantages. Until the standard for the new Ultra HD 4K Blu-ray discs is released later this year, it is unclear whether both will survive.

In the interim, those of us with 4K cameras may eventually regret the choices we have made.
 
Yeah, and I'd better just sit this ongoing "4K battle" out and wait a bit, until the "8K video battle" commences in earnest. Won't be long now. ;-)

As an aside, I just received my two identical 4K monitors and was pleasantly surprised that I can run them in a user-defined but true ITU-R Recommendation BT.2020 mode. The color gamut in this mode is far wider than the ITU-R BT.709 color space we have for HD material.
 
The standard is already set, and "Ultra HD Blu-ray will use the H.265 codec and ship on double and triple-layer discs of 66GB and 100GB capacities, respectively."

XAVC is a codec or wrapper invented by Sony (a variant of H.264). It is not a standard. HEVC (H.265) is a standard. It is now supported natively by Windows 10; it is used by Netflix; and H.265 is supported on many 4K TVs that do not play XAVC files.
 
Does it really matter if the codec you're recording with is an industry standard or not, as long as you fulfill this requirement when delivering?
 
Well, yes and no.

XAVC is an H.264/AVC format, like AVCHD, and AVC has become the de facto standard in virtually all video encoding used worldwide in cameras, camcorders, broadcast video, and satellite transmission. The newest flavor of XAVC supports 4K UHD and is already heavily in use. One could imagine this form of encoding appearing on future 4K UHD Blu-ray discs, even though HEVC, with nearly twice the efficiency, has been named as a codec supported on the new discs.

When the Blu-ray spec was originally developed, a similar situation existed. The newer codec, AVC, offered efficiency advantages, whereas MPEG-2 and HDV video were less efficient but very well established. At that time, Microsoft's Windows Media Video codec, VC-1, was also included in the standard, not because it was especially popular, but because Microsoft was well represented on the standards committee and exerted considerable influence.

It is therefore entirely possible that both HEVC (H.265) and AVC (H.264) will be supported on the new 4K Blu-ray discs. In that case, the author can choose and the player will abide.

What I want to avoid entirely is having to recompress the 4K from my XAVC cameras into another format, HEVC, for authoring. The time involved, as well as the quality penalty, makes this a very unattractive prospect for early 4K adopters whose cameras can only create the older AVC content.

Larry
 
"It is therefore entirely possible that both HEVC (h.265) as well as AVC (h.264) will BOTH be supported on the new 4K BluRay disks. In doing so, the author can choose and the player will abide."

No, the standards for 4K UHD Blu-ray are already set: it is HEVC, not XAVC. They never changed the old Blu-ray standard, and they are unlikely to change the new one either. It is done.
 
But don't you recompress after editing anyway? Or do you mean that XAVC to HEVC looks worse than HEVC to HEVC? If that's the case, then besides avoiding compression altogether, optimal quality can only be achieved if the codecs match throughout the workflow chain.
 
You are correct. Whatever codec your camera records, you will edit the clips and have to render, and thus re-compress. There is no difference between rendering from XAVC to H.265 and from H.265 to H.265, unless the original clips shot in one or the other are better to begin with.

The point is that HEVC is a much more efficient codec than H.264, enabling smaller files. It does take more computing power to encode H.265, but newer chips will include hardware HEVC encoding, so the process will be much faster in the future.

The other point is that the Blu-ray standard should not dictate which camera you buy or which codec you shoot with. One can be confident, however, that there will be software and hardware support for H.265 for the foreseeable future, and you can take advantage of its greater efficiency.
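
For what it's worth, here is a minimal sketch of that re-render step on a PC, assuming an ffmpeg build with the libx265 software encoder is available on the PATH; the file names are placeholders, not anything from this thread:

import subprocess

src = "C0001_xavc.mp4"   # hypothetical XAVC/AVC camera clip
dst = "C0001_hevc.mp4"   # hypothetical HEVC output

subprocess.run([
    "ffmpeg", "-i", src,
    "-c:v", "libx265",   # software HEVC encode (slow; hardware encoders are faster)
    "-crf", "23",        # quality target: lower means better quality, larger file
    "-preset", "medium", # speed vs. compression-efficiency trade-off
    "-c:a", "copy",      # pass the audio through untouched
    dst,
], check=True)

The encode time and the extra generation of compression are exactly the costs being weighed above.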
 
Last I heard, if you can't match acquisition and distribution codecs/bitrates, footage in an editing system is decompressed before it gets re-compressed for output, with a generational loss.

If you want to dump 4K straight to Blu-ray for distribution, re-encoding only things like titles and transitions, I'm not sure that can happen when there isn't a matching Blu-ray codec in the standard... unless some players support more than the minimal standard, which did happen with current Blu-ray players.

Of course, that's not an option for pro-level Blu-ray distribution... kind of a moot argument, since physical media has been dying for years now.

The better question might be: will YouTube accept raw camera HEVC uploads?
 
