{"id":130428,"date":"2021-02-12T21:24:10","date_gmt":"2021-02-12T21:24:10","guid":{"rendered":"https:\/\/developer.apple.com\/news\/?id=rwbholxw"},"modified":"2021-02-12T21:24:10","modified_gmt":"2021-02-12T21:24:10","slug":"support-hdr-video-playback-editing-and-export-in-your-app","status":"publish","type":"post","link":"https:\/\/sickgaming.net\/blog\/2021\/02\/12\/support-hdr-video-playback-editing-and-export-in-your-app\/","title":{"rendered":"Support HDR video playback, editing, and export in your app"},"content":{"rendered":"<div class=\"inline-article-image\"><img decoding=\"async\" src=\"https:\/\/www.sickgaming.net\/blog\/wp-content\/uploads\/2022\/12\/support-hdr-video-playback-editing-and-export-in-your-app.jpg\" data-hires=\"false\" alt=\"Four film strip icons with a pencil, arrows, play button, and star button on them, on a blue\/green background.\"><\/div>\n<p>You can help people create more vivid and true-to-life video when you support high dynamic range (HDR) in your app. And when you support HDR with Dolby Vision, people with iPhone 12 or iPhone 12 Pro can go even further and shoot, edit, and play cinema-grade videos right from their device. Dolby Vision tuning is provided dynamically to each frame, preserving the intended look of the original shots.<\/p>\n<p>Here\u2019s how you can provide the best HDR video playback, editing, and export experience.<\/p>\n<h3>Get started with HDR video<\/h3>\n<p>Your app needs to support iOS 14.1 or later to take advantage of HDR video. 
To begin, we recommend reviewing a few WWDC sessions which provide a good overview of the process.<\/p>\n<section class=\"grid activity\">\n<section class=\"row\">\n<section class=\"column large-4 small-4 no-padding-top no-padding-bottom\"> <a href=\"https:\/\/developer.apple.com\/wwdc20\/10010\" class=\"activity-image-link\"> <img decoding=\"async\" class=\"actiity-image medium-scale\" width=\"250\" src=\"https:\/\/www.sickgaming.net\/blog\/wp-content\/uploads\/2022\/12\/support-hdr-video-playback-editing-and-export-in-your-app-1.jpg\" data-hires=\"false\" alt> <\/a> <\/section>\n<section class=\"column large-8 small-8 padding-left-small padding-top-small padding-bottom-small no-padding-top no-padding-bottom\"> <a href=\"https:\/\/developer.apple.com\/wwdc20\/10010\"> <\/p>\n<h4 class=\"no-margin-bottom activity-title\">Export HDR media in your app with AVFoundation<\/h4>\n<p class=\"activity-description\">Discover how to author and export high dynamic range (HDR) content in your app using AVFoundation. Learn about high dynamic range and how you can take advantage of it in your app. 
We\u2019ll show you how to implement feature sets that allow people to export HDR content, go over supported HDR formats,&#8230;<\/p>\n<p> <\/a> <\/section>\n<\/section>\n<\/section>\n<section class=\"grid activity\">\n<section class=\"row\">\n<section class=\"column large-4 small-4 no-padding-top no-padding-bottom\"> <a href=\"https:\/\/developer.apple.com\/wwdc20\/10009\" class=\"activity-image-link\"> <img decoding=\"async\" class=\"actiity-image medium-scale\" width=\"250\" src=\"https:\/\/www.sickgaming.net\/blog\/wp-content\/uploads\/2022\/12\/support-hdr-video-playback-editing-and-export-in-your-app-2.jpg\" data-hires=\"false\" alt> <\/a> <\/section>\n<section class=\"column large-8 small-8 padding-left-small padding-top-small padding-bottom-small no-padding-top no-padding-bottom\"> <a href=\"https:\/\/developer.apple.com\/wwdc20\/10009\"> <\/p>\n<h4 class=\"no-margin-bottom activity-title\">Edit and play back HDR video with AVFoundation<\/h4>\n<p class=\"activity-description\">Find out how you can support HDR editing and playback in your macOS app, and how you can determine if a specific hardware configuration is eligible for HDR playback. We&#8217;ll show you how to use AVMutableVideoComposition with the built-in compositor and easily edit HDR content, explain how you can use&#8230;<\/p>\n<p> <\/a> <\/section>\n<\/section>\n<\/section>\n<hr>\n<p><em>Note: iPhone 12 and iPhone 12 Pro record HDR video in Dolby Vision Profile 8.4, Cross-compatibility ID 4 (HLG) format, using an HEVC (10-bit) codec. This format is designed to be backwards compatible with HLG, allowing existing HEVC decoders to decode as HLG. Video is recorded by the Camera app as a QuickTime File Format (QTFF) movie (.mov extension). 
Signaling for Dolby Vision in a QTFF movie is similar to signaling in Dolby Vision Streams within the ISO base media file format.<\/em><\/p>\n<hr>\n<p><a href=\"https:\/\/sforce.co\/3clmE2M\" class=\"icon icon-after icon-chevronright\">Learn more about Dolby Vision Profiles<\/a><\/p>\n<p><a href=\"https:\/\/sforce.co\/2NAaF6X\" class=\"icon icon-after icon-chevronright\">Learn more about Dolby Vision Levels<\/a><\/p>\n<p><a href=\"https:\/\/dolby.force.com\/professionalsupport\/s\/article\/How-to-signal-Dolby-Vision-in-ISOBMFF-format-AKA-mp4-container?language=en_US\" class=\"icon icon-after icon-chevronright\">Learn more about Dolby Vision Streams<\/a><\/p>\n<h3>Support HDR video playback in your app<\/h3>\n<p>Both iOS and macOS support HDR video playback on all eligible devices. Use <code>eligibleForHDRPlayback<\/code> on <code>AVPlayer<\/code> to check for HDR playback support on the current device. In general, the classes <code>AVPlayer<\/code>, <code>AVPlayerLayer<\/code>, or <code>AVSampleBufferDisplayLayer<\/code> can be used to play Dolby Vision video. If your app uses <code>AVPlayer<\/code>, you don\u2019t need to add anything to your code: The <code>AVFoundation<\/code> framework automatically sets up an HDR playback pipeline to handle Dolby Vision Profile 8.4 if it detects an asset in Dolby Vision and the device supports HDR playback.<\/p>\n<p>If your app uses <code>AVSampleBufferDisplayLayer<\/code> to render video, make sure any sample buffers passed to the sample buffer display layer are in formats suitable for HDR and carry Dolby Vision Profile 8.4 per-frame metadata. These sample buffers need to have 10-bit or higher bit depth. A commonly used 10-bit format is 4:2:0 Y\u2019CbCr video range, represented by <code>kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange<\/code>. 
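<\/p>\n<p>As a quick sketch, you might gate your sample buffer path on HDR eligibility and confirm each decoded buffer uses this 10-bit format before enqueueing it (the helper function name here is illustrative, not part of the framework):<\/p>\n<pre class=\"code-source\"><code>import AVFoundation\n\n\/\/ Illustrative helper: true when the device can play HDR and the\n\/\/ decoded buffer is in the 10-bit 4:2:0 video-range pixel format.\nfunc canEnqueueForHDR(_ sampleBuffer: CMSampleBuffer) -&gt; Bool {\n    guard AVPlayer.eligibleForHDRPlayback,\n          let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {\n        return false\n    }\n    return CVPixelBufferGetPixelFormatType(imageBuffer) ==\n        kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange\n}<\/code><\/pre>\n<p>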
The associated <code>OSType<\/code> for this pixel format is <code>\u2019x420\u2019<\/code>.<\/p>\n<p>If your sample buffers are decoded using <code>VTDecompressionSession<\/code>, you can carry the Dolby Vision Profile 8.4 per-frame metadata in the buffers by using <code>kVTDecompressionPropertyKey_PropagatePerFrameHDRDisplayMetadata<\/code>. This value is true by default.<\/p>\n<p><strong>Asset inspection<\/strong><br \/>\n<code>AVMediaCharacteristic<\/code> provides options for specifying media type characteristics, including whether a video includes HDR metadata. You can use the Swift media characteristic <code>containsHDRVideo<\/code> to identify whether any segment of a track contains HDR so that your app can render it correctly. In Objective-C, you can use <code>AVMediaCharacteristicContainsHDRVideo<\/code>, defined in <code>AVMediaFormat.h<\/code>. <\/p>\n<p>After loading the tracks property using the Swift method <code>loadValuesAsynchronously(forKeys:completionHandler:)<\/code>, you can get HDR tracks using <code>tracks(withMediaCharacteristic:)<\/code>. 
Here\u2019s how you might get all desired HDR tracks:<\/p>\n<pre class=\"code-source\"><code><span class=\"syntax-keyword\">let<\/span> hdrTracks <span class=\"syntax-operator\">=<\/span> asset.tracks(withMediaCharacteristic: .containsHDRVideo)<\/code><\/pre>\n<p>In a similar fashion, you can use the Objective-C method <code>loadValuesAsynchronouslyForKeys:completionHandler:<\/code> to load the tracks property and obtain the HDR tracks with the method <code>tracksWithMediaCharacteristic:<\/code>, like so:<\/p>\n<pre class=\"code-source\"><code><span class=\"syntax-built_in\">NSArray<\/span>&lt;<span class=\"syntax-built_in\">AVAssetTrack<\/span> *&gt; *hdrTracks = [asset tracksWithMediaCharacteristic:<span class=\"syntax-built_in\">AVMediaCharacteristicContainsHDRVideo<\/span>];<\/code><\/pre>\n<p>The <code>hasMediaCharacteristic(_:)<\/code> method can be used to track media characteristics, such as HDR media type, format descriptions, or explicit tagging. For example:<\/p>\n<pre class=\"code-source\"><code><span class=\"syntax-keyword\">if<\/span> track.hasMediaCharacteristic(.containsHDRVideo){ }<\/code><\/pre>\n<p>In Objective-C, you can use the same <code>hasMediaCharacteristic:<\/code> method for explicit tagging, as demonstrated here:<\/p>\n<pre class=\"code-source\"><code><span class=\"syntax-keyword\">if<\/span>([track hasMediaCharacteristic:<span class=\"syntax-built_in\">AVMediaCharacteristicContainsHDRVideo<\/span>]){ }<\/code><\/pre>\n<h3>Support HDR video editing and previewing in your app<\/h3>\n<p>To add HDR content editing to your application, use <code>AVVideoComposition<\/code>. 
If you\u2019re using the built-in compositor, you can also use the Swift initializer <code>init(asset:applyingCIFiltersWithHandler:)<\/code> or the Objective-C initializer <code>videoCompositionWithAsset:applyingCIFiltersWithHandler:<\/code> with built-in <code>CIFilters<\/code> to easily incorporate an HDR editing pipeline in your app.<\/p>\n<p>Custom compositors can support HDR content, too: You can use the <code>supportsHDRSourceFrames<\/code> property to indicate HDR capability. For Objective-C, the <code>supportsHDRSourceFrames<\/code> property is part of the <code>AVVideoCompositing<\/code> protocol defined in <code>AVVideoCompositing.h<\/code>.<\/p>\n<p>If your custom compositor needs to operate in 10-bit HDR pixel formats, you\u2019ll need to select pixel buffer attributes that your compositor can accept as input by using the <code>sourcePixelBufferAttributes<\/code> property. For Objective-C, this property is found in <code>AVVideoCompositing.h<\/code>. The value of this property is a dictionary that contains pixel buffer attribute keys defined in the <code>CoreVideo<\/code> header file <code>CVPixelBuffer.h<\/code>.<\/p>\n<p>Additionally, to create new buffers for processing, you\u2019ll need the correct pixel buffer attributes required by the video compositor. For this purpose, use the property <code>requiredPixelBufferAttributesForRenderContext<\/code>.<\/p>\n<p>If your app offers video previewing during editing, modifying the pixel values may invalidate the video\u2019s existing dynamic metadata and its usage. Because the Dolby Vision Profile 8.4 metadata is completely transparent, you can use <code>AVPlayerItem<\/code> to drop any invalid metadata during preview-only scenarios, as well as update dynamic metadata during export to reflect changes in the video content. <\/p>\n<p>To configure HDR settings, you can use the <code>appliesPerFrameHDRDisplayMetadata<\/code> property from <code>AVPlayerItem<\/code>, which defaults to true. 
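<\/p>\n<p>For example, a preview player whose frames have just been re-rendered might drop the now-stale metadata like this (a minimal Swift sketch; <code>editedMovieURL<\/code> is a placeholder for your working copy):<\/p>\n<pre class=\"code-source\"><code>import AVFoundation\n\nlet item = AVPlayerItem(url: editedMovieURL) \/\/ placeholder URL for the edited movie\nitem.appliesPerFrameHDRDisplayMetadata = false \/\/ don\u2019t apply per-frame Dolby Vision metadata\nlet player = AVPlayer(playerItem: item)<\/code><\/pre>\n<p>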
In Objective-C, the property defaults to YES and can be found in <code>AVPlayerItem.h<\/code>.<\/p>\n<p>By default, <code>AVFoundation<\/code> will attempt to use Dolby Vision metadata when it\u2019s present in a video, but you can tell your app to ignore it: Just set the <code>appliesPerFrameHDRDisplayMetadata<\/code> property from <code>AVPlayerItem<\/code> to false in Swift, or NO in Objective-C. If your application is using <code>VTDecompressionSession<\/code> APIs from <code>VideoToolbox<\/code>, you can turn off Dolby Vision tone mapping with <code>kVTDecompressionPropertyKey_PropagatePerFrameHDRDisplayMetadata<\/code>. To use this property in C or Objective-C, make sure to include the VideoToolbox framework header <code>VTDecompressionProperties.h<\/code>.<\/p>\n<h3>Support HDR export in your app<\/h3>\n<p>You can support HDR video export in your app when you work with <code>AVAssetWriter<\/code> and HEVC presets.<\/p>\n<p><strong>Discover presets and AVAssetExportSession<\/strong><br \/>\nAll HEVC presets have been upgraded to support HDR. The output format will match the source format, so if the source file is Dolby Vision Profile 8.4, the exported movie will also be Dolby Vision Profile 8.4. If you need to change the output format, you can use <code>AVAssetWriter<\/code>.<\/p>\n<hr>\n<p><em>Note: H.264 presets will convert HDR to Standard Dynamic Range (SDR).<\/em><\/p>\n<hr>\n<p>To preserve Dolby Vision Profile 8.4 during export using <code>AVAssetWriter<\/code>, you must choose a suitable output format, color properties that support Dolby Vision, and a 10-bit profile level. <\/p>\n<p>To start, note that querying <code>supportedOutputSettingsKeys(for:)<\/code> in Swift or <code>supportedOutputSettingsKeysForConnection:<\/code> in Objective-C provides a list of output settings keys supported for the current device. 
<\/p>\n<p>For Dolby Vision export, the video output settings dictionary key <code>AVVideoCompressionPropertiesKey<\/code> allows you to control bit rate, B-frame delivery, I-frame interval, and codec quality. The value associated with this key is an instance of <code>NSDictionary<\/code>. For Objective-C, this key is found in <code>AVVideoSettings.h<\/code>.<\/p>\n<p>For example, a video output settings dictionary for Dolby Vision in Swift would contain these key\/value pairs:<\/p>\n<pre class=\"code-source\"><code><span class=\"syntax-keyword\">let<\/span> videoOutputSettings: [<span class=\"syntax-type\">String<\/span>: <span class=\"syntax-keyword\">Any<\/span>] <span class=\"syntax-operator\">=<\/span> [ <span class=\"syntax-type\">AVVideoCodecKey<\/span>: <span class=\"syntax-type\">AVVideoCodecType<\/span>.hevc, <span class=\"syntax-type\">AVVideoProfileLevelKey<\/span>: kVTProfileLevel_HEVC_Main10_AutoLevel, <span class=\"syntax-type\">AVVideoColorPropertiesKey<\/span>: [ <span class=\"syntax-type\">AVVideoColorPrimariesKey<\/span>: <span class=\"syntax-type\">AVVideoColorPrimaries_ITU_R_2020<\/span>, <span class=\"syntax-type\">AVVideoTransferFunctionKey<\/span>: <span class=\"syntax-type\">AVVideoTransferFunction_ITU_R_2100_HLG<\/span>, <span class=\"syntax-type\">AVVideoYCbCrMatrixKey<\/span>: <span class=\"syntax-type\">AVVideoYCbCrMatrix_ITU_R_2020<\/span> ], <span class=\"syntax-type\">AVVideoCompressionPropertiesKey<\/span>: [ kVTCompressionPropertyKey_HDRMetadataInsertionMode: kVTHDRMetadataInsertionMode_Auto ]\n]<\/code><\/pre>\n<p>With Objective-C, your video output settings dictionary would contain the same key\/value pairs:<\/p>\n<pre class=\"code-source\"><code><span class=\"syntax-built_in\">NSDictionary<\/span>&lt;<span class=\"syntax-built_in\">NSString<\/span>*, <span class=\"syntax-type\">id<\/span>&gt;* videoOutputSettings = @{ <span class=\"syntax-built_in\">AVVideoCodecKey<\/span>: <span 
class=\"syntax-built_in\">AVVideoCodecTypeHEVC<\/span>, <span class=\"syntax-built_in\">AVVideoProfileLevelKey<\/span>: (__bridge <span class=\"syntax-built_in\">NSString<\/span>*)kVTProfileLevel_HEVC_Main10_AutoLevel, <span class=\"syntax-built_in\">AVVideoColorPropertiesKey<\/span>: @{ <span class=\"syntax-built_in\">AVVideoColorPrimariesKey<\/span>: <span class=\"syntax-built_in\">AVVideoColorPrimaries_ITU_R_2020<\/span>, <span class=\"syntax-built_in\">AVVideoTransferFunctionKey<\/span>: <span class=\"syntax-built_in\">AVVideoTransferFunction_ITU_R_2100_HLG<\/span>, <span class=\"syntax-built_in\">AVVideoYCbCrMatrixKey<\/span>: <span class=\"syntax-built_in\">AVVideoYCbCrMatrix_ITU_R_2020<\/span> }, <span class=\"syntax-built_in\">AVVideoCompressionPropertiesKey<\/span>: @{ (__bridge <span class=\"syntax-built_in\">NSString<\/span>*)kVTCompressionPropertyKey_HDRMetadataInsertionMode: (__bridge <span class=\"syntax-built_in\">NSString<\/span>*)kVTHDRMetadataInsertionMode_Auto }\n};<\/code><\/pre>\n<p>In Objective-C, the key <code>kVTCompressionPropertyKey_HDRMetadataInsertionMode<\/code> and the value <code>kVTHDRMetadataInsertionMode_Auto<\/code> are found in <code>VTDecompressionProperties.h<\/code>.<\/p>\n<p>In addition to defining key\/value pairs, make sure that the pixel buffers presented to <code>AVAssetWriterInput<\/code> are a 10-bit 4:2:0 Y\u2019CbCr video range represented by &#8216;x420&#8217; OSType.<\/p>\n<p>You may elect to use a separate <code>AVAssetReader<\/code> or <code>AVAssetWriter<\/code> model for export. In that case, you can use the VideoToolbox property <code>kVTCompressionPropertyKey_PreserveDynamicHDRMetadata<\/code> and set it to kCFBooleanFalse or false for C\/Objective-C or Swift respectively. When you set the VideoToolbox property, <code>AVAssetWriter<\/code> will automatically recompute the Dolby Vision Profile 8.4 metadata for exporting the file. 
This should be done as your app modifies the output frames from the <code>AVAssetReader<\/code>.<\/p>\n<h3>Resources<\/h3>\n<p><a href=\"https:\/\/developer.apple.com\/av-foundation\/\" class=\"icon icon-after icon-chevronright\">Learn more about AVFoundation<\/a><\/p>\n<p><a href=\"https:\/\/developer.apple.com\/documentation\/avfoundation\" class=\"icon icon-after icon-chevronright\">AVFoundation<\/a><\/p>\n<p><a href=\"https:\/\/developer.apple.com\/documentation\/videotoolbox\" class=\"icon icon-after icon-chevronright\">Video Toolbox<\/a><\/p>\n<p><a href=\"https:\/\/sforce.co\/3clmE2M\" class=\"icon icon-after icon-chevronright\">Learn more about Dolby Vision Profiles<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>You can help people create more vivid and true-to-life video when you support high dynamic range (HDR) in your app. And when you support HDR with Dolby Vision, people with iPhone 12 or iPhone 12 Pro can go even further and shoot, edit, and play cinema-grade videos right from their device. 
Dolby Vision tuning is [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":130429,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[55],"tags":[],"class_list":["post-130428","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-apple-developer-news"],"_links":{"self":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts\/130428","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/comments?post=130428"}],"version-history":[{"count":0,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts\/130428\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/media\/130429"}],"wp:attachment":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/media?parent=130428"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/categories?post=130428"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/tags?post=130428"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}