Daily Videos on Holidays with iMovie and Final Cut Pro

Well, I’ve already written about how you can shoot and edit an entire wedding video on the iPhone 12 Pro Max with the included iMovie, but just this week, I did a series of “dailies” from a hiking trip that went really well. We were completely off the grid, so all the photos had to be gathered by AirDrop. This worked super well, and I’m amazed at how reliable Apple has made it. With a mix of phones that included an iPhone 6, 6s, 11, 12 Pro Max, and 13 Pro, we just turned on Bluetooth and WiFi, and sharing was basically instant as long as AirDrop was set to allow anyone to connect.

Step 1. Gather the photos and videos via AirDrop and SD cards

It is not super intuitive to most people that AirDrop needs both Bluetooth and WiFi, but for tech geeks it makes sense: it uses Bluetooth to detect proximity and then sets up an ad hoc, invisible WiFi network to transfer 100 or more photos in less than 15 seconds.
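To put a rough number on that transfer, here is a back-of-the-envelope sketch. The ~3 MB-per-photo figure is my assumption for a typical HEIC file, not a measurement:

```python
# Rough arithmetic behind the AirDrop transfer described above.
# Assumes roughly 3 MB per HEIC photo (an assumption, not measured).

def transfer_rate_mb_s(num_photos: int, mb_per_photo: float, seconds: float) -> float:
    """Average throughput needed to move the whole batch in the given time."""
    return num_photos * mb_per_photo / seconds

# 100 photos in 15 seconds works out to about 20 MB/s, well within
# what a short-range ad hoc WiFi link can sustain.
rate = transfer_rate_mb_s(100, 3.0, 15.0)
```

So even the "100 photos in under 15 seconds" case only needs about 20 MB/s of sustained throughput, which is why AirDrop feels instant at close range.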

We were also able to gather data from non-Apple devices. I’m really happy the Apple MacBook Pro 14 has an SD slot; I had thought this was silly, but in these circumstances, it made it easy to collect photos from two Canon DSLRs.

The only glitch was actually getting photos from an Android phone. I had thought that with Android Studio this would be easy, but it turns out you actually have to download the SDK, so next time I’m going to have Android File Transfer ready so I can just plug the phone into the USB-C port. We also tried to get data off an Amazon Kindle, but that was a real failure.

Step 2. Choosing between an iPad and a MacBook depends on the music

OK, then the question is whether to use an iPad or a MacBook to do all this. It turns out this comes down to just one question: did you download music? Music really makes a video great, and in our case, two years ago, we used an iPad with iMovie to great effect. This time, though, I forgot to download music, but fortunately I had some actual MP3s on the MacBook, so that is what we used.

The main lesson here is that music really makes a slideshow great, so we went with the MacBook this time.

Step 3. Choosing between Photos Slideshow, iMovie, and Final Cut Pro for quick playback

Well, here is one of the strange things about Photos: it has a great Slideshow mode, which you access by creating an Album (not a Shared Album, that doesn’t work!). Then in the upper right, you click on Slideshow, you can select Music, and it really works well. The main issue is that there is actually no way to edit or save anything, but it is super fast. Note that you need the Apple Music content downloaded to play anything.

So, if that doesn’t work and you only need 1080p SDR movies, then iMovie is the next simplest thing. It is bundled, so just make sure it is loaded, and then it is pretty much perfect. The main thing is that when it applies the Ken Burns effect, it sometimes does the wrong thing, but it’s easy to go through and fix. It also has decent titles. It was terrific to be able to edit a bunch of clips and have a movie of 8 minutes or so ready to go.

Step 4. If you want HDR and 4K movies, then Final Cut Pro

Well, this is all well and good, but for the final result, you might want to take advantage of the 4K HDR content you were hopefully shooting with your iPhone 13 Pro in Cinematic mode, or the HDR photos you were taking. Then things get more involved, and hopefully you have Final Cut Pro around. It is the big brother of iMovie, and in fact, you can take Events that you have in iMovie and move them over.

But there are some real gotchas here that are not that clear, so for example:

  1. If you want to edit an HDR movie, you have to create an entirely new Library set up for Wide Gamut HDR. So, create a new Library with File/New Library and make sure to pick Wide Gamut HDR as the color processing.
  2. Then when you create a new Project, you will get a dizzying array of choices, from Rec. 2020 to Rec. 2020 PQ to Rec. 2020 HLG. The short answer is that all Apple devices understand HLG (Hybrid Log-Gamma), which looks decent on both HDR and SDR playback devices, so this is a good choice. The Dolby Perceptual Quantizer (PQ) curve, by contrast, is what underlies HDR10 and Dolby Vision. The big difference between HDR10 and Dolby Vision is that the latter has dynamic metadata, so the encoding parameters can change scene by scene, while HDR10’s metadata is static. HLG carries no metadata at all; its curve is designed as a compromise that displays can tone-map on the fly.
  3. Now the second confusing thing is how to get all the photos into the system. The Browser is confusing because, in the first pane, you get the Event (in this system, all Projects are part of Events), but there is a completely separate pane that shows you all the photos and videos in Apple Photos. What you have to do is drag the pictures you want onto the title of the Event pane, wait, and then they will show up and you can drag them to the Event you want. It’s a pretty confusing interface, as there is no right-click to move things to or from an Event.
  4. Now the Ken Burns effect is really buried in Final Cut Pro (it’s kind of a quick gimmick after all, but it makes photos look good). What you are supposed to do is go to the Timeline at the bottom, and then in the Viewer, which is normally at the top, select Crop (or press Shift-C), and Ken Burns should come up as an option. At least for me, on Final Cut Pro as of September 2022, this didn’t work that well, so instead I selected the image, and then in the upper right, there is a film icon that is the Video Inspector; one of its properties is Crop, and the Type becomes Ken Burns. The feature is pretty smart, but you do have to adjust it. One annoying thing is that for selfies, sometimes the camera stores the image upside down, and even if you use Transform to rotate it, it still shows upside down in the main viewer.
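If you are curious what HLG actually does, the BT.2100 transfer curve is simple enough to sketch in a few lines of Python. The constants come straight from the BT.2100 spec; the curve is square-root (camera-like) in the shadows and logarithmic in the highlights, which is why HLG degrades gracefully on SDR displays:

```python
import math

# Sketch of the BT.2100 HLG opto-electrical transfer function (OETF),
# which maps linear scene light E in [0, 1] to an HLG signal in [0, 1].
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # ~0.55991073

def hlg_oetf(e: float) -> float:
    """BT.2100 HLG OETF: square-root below 1/12, logarithmic above."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C
```

The two branches meet exactly at E = 1/12 (signal 0.5), and peak scene light E = 1 maps to a signal of 1.0, which is the "compatibility" property that lets the same stream play back on SDR and HDR devices.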

The Confusion that is HDR

OK, one important note: HDR is a real mess inside Final Cut Pro, especially when you are mixing SDR (Standard Dynamic Range) with HDR (High Dynamic Range) content. SDR has a limit of 100 nits. As a refresher, a nit is the brightness of a single candle spread over a square meter (technically $1\,\mathrm{cd/m^2}$), so it’s pretty dim. A movie theater, for instance, has a maximum illumination of 50 nits; at the brightest scene, you get the equivalent of 50 candles per square meter of screen. (Lumens, in contrast, measure the total light a projector emits.) SDR tops out at 100 nits, but an iPhone 14 Pro with its OLED screen has a peak HDR brightness of 1600 nits, and the iPhone 13 Pro hits 1200 nits. (Outdoors, the iPhone 14 Pro can go up to 2000 nits, which is very welcome on these bright climate-change days.) An LG C9 OLED has a brightness of 780 nits, so not as bright as a Samsung QLED. Also note that brightness depends on how much of the screen is lit: an LG C1, for instance, hits 799 nits with 2% of the screen lit but only 146 nits with the entire screen at 100%.
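Those nit figures are easiest to compare in photographic stops (factors of two of light), so here is a quick sketch of the arithmetic:

```python
import math

# How many stops of extra highlight headroom an HDR peak gives you
# over SDR's 100-nit reference white.
def extra_stops(hdr_peak_nits: float, sdr_peak_nits: float = 100.0) -> float:
    return math.log2(hdr_peak_nits / sdr_peak_nits)

# The iPhone 14 Pro's 1600-nit peak is exactly four stops above SDR white.
iphone_14_pro = extra_stops(1600)
```

So the jump from a 100-nit SDR ceiling to a 1600-nit OLED peak is four full stops of highlight headroom, which is why properly graded HDR looks so much punchier.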

So what are the color spaces used in, say, a standard iPhone or a Canon camera?

  1. If you are shooting HDR content with, say, Cinematic mode on an iPhone, you are using HLG.
  2. The next problem is that you are probably going to be mixing in videos that are SDR content (technically BT.709 for video).
  3. For stills, it is sRGB (or Adobe RGB) on a Canon; on an iPhone, it is Display P3.
  4. Some tutorials say you have to go to color correction and crank up the brightness, which basically creates a pseudo-HDR image that looks unnatural.
  5. When the iPhone cameras decide to shoot in HDR (really HLG) format, at least some of the non-HDR images I took looked very, very dark next to them, so you have to do some manual color grading. The simple choice is just to bring up the exposure a little; if you turn on the scopes in the viewer, you can look at luminance and see these images sitting at 100 nits, which looks too dark, so most people will bring them up to around 1,000 nits, which looks about right.
  6. There is a Reddit post that basically suggests that to make this work, you take all the non-HDR clips, add the HDR Tools color effect, and then choose PQ to HLG (Rec. 2100) to adjust the SDR clips to the correct HDR. What is going on here is that there are various HDR standards, and PQ is apparently the closest to BT.709 in how it places the SDR levels.
  7. Alternatively, if you go to the lower right of the screen and click on the film icon, which is the Video Effects browser, there is a search box at the very bottom; search for HDR Tools. Then right-click on HDR Tools and make it the default effect, which enables Option-E to add it to every image; for each image, set the conversion to PQ to HLG (Rec. 2100), and this will remap that SDR image into HDR.
  8. Actually, I found that the PQ to HLG trick brings the image up a little and looks better. You can also go full “Martian effect” and adjust each one manually: go to the Color Board, push the right-most (highlights) control up to 1,000 nits, and push the shadows down to 1 nit. Most of the time this looks pretty good, but I find the PQ to HLG trick more natural looking.
  9. Also, some cameras just run really dark. We shot with an iPhone 6, 6s, 11, 12 Pro, and 13 Pro, and the 6 and 6s seemed very dark, so it is a good time to use your artistic license to make them look closer.
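As a simplified illustration of the manual grading idea above (this is not what HDR Tools computes internally, just the gain step), lifting SDR reference white to an HDR level is a linear scale in light:

```python
# Hypothetical sketch: scale an SDR clip's linear luminance so that its
# 100-nit reference white lands at a chosen HDR level (the post above
# settles on roughly 1,000 nits). Real conversions like PQ-to-HLG also
# change the transfer function; this only illustrates the exposure lift.
def sdr_to_hdr_nits(sdr_luminance_nits: float, target_white_nits: float = 1000.0) -> float:
    sdr_white = 100.0
    return sdr_luminance_nits * (target_white_nits / sdr_white)

# SDR mid-gray around 18 nits maps to 180 nits at a 1,000-nit white.
mid_gray = sdr_to_hdr_nits(18.0)
```

A flat gain like this is why the cranked-up "pseudo-HDR" look can feel unnatural: everything gets brighter by the same factor, whereas a real tone-mapping curve treats shadows and highlights differently.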

Now, creating files for Apple devices is pretty easy:

  1. To create the video, choose File/Share and then Export File; the Apple Devices options produce very small files, while the plain Audio and Video settings produce huge ones.
  2. I found that this ordinary File/Share export worked pretty well and is about 10x faster than Compressor while generating files of about the same size. There is a dizzying number of video formats, and you can also use the dedicated Compressor application. Set to Apple Devices, the files are tiny, like 1 GB for 7 minutes; set to just Audio and Video, they are huge.
  3. Make sure Final Cut is set to show HDR as tone mapped if you do not have an HDR monitor. I’m lucky: the MacBook Pro 2021 and the LG B6 are both HDR, so I don’t need this.
  4. Apple Compressor basically does the HEVC encoding for you. The main trick is that after you send a new batch over, you have to drag and drop the format you want from Compressor’s left pane onto the batch on the right. This was pretty confusing to me, because if you don’t do this, the compressor never runs, even if you hit Start Batch.
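Sanity-checking those file sizes with a little arithmetic shows why the Apple Devices export is so compact:

```python
# Average bitrate implied by a file size and duration.
def avg_bitrate_mbps(size_gb: float, minutes: float) -> float:
    bits = size_gb * 1e9 * 8          # decimal GB to bits
    return bits / (minutes * 60) / 1e6  # megabits per second

# A 1 GB file for a 7-minute video is roughly 19 Mbit/s,
# a plausible bitrate for a 4K HEVC delivery file.
apple_devices = avg_bitrate_mbps(1.0, 7.0)
```

A mastering-grade "Audio and Video" export in ProRes can easily run an order of magnitude higher than that, which is exactly the "huge files" effect described above.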

Step 5. Compress and share to Google Photos and 4K HDR YouTube (wait for it!)

Sharing with Google Photos is pretty easy: just go to Google Photos and choose Upload. You want to upload first to your main timeline and then create a Shared Album and add the items there. One complication is that Google supports 4K but not HDR playback, so if you want HDR viewable on the web without making people download it, you need to upload the HDR version to YouTube, which does support it.

Why do you care about HDR? The big reason is that SDR has a brightness of at most 100 nits, but HDR can go up to 10,000 nits (translation: really bright). Technically, instead of the 8 bits per color of SDR, HDR uses 10 bits per color, so lots more colors are available. The main complication is how to transcode the thing properly, and it is definitely not easy:

  1. Ethan Mitchell, a YouTuber, likes PQ (instead of HLG) as the native format for editing, and he uses a color space override to change all the clips to PQ (Perceptual Quantizer).
  2. Press Command-7 to show the color scopes, look at the luminance scope, and set it to Waveform with values in nits. You want highlights around 1,000 nits for most mobile displays and shadows around 1. He doesn’t use the PQ to HLG trick but instead grades each image separately with color correction, putting the highlights right at 1,000 nits and lowering the shadows to 1 nit.
  3. Then you export with 10-bit HEVC and push it out. For me, the HDR did not come out properly, so this could be a settings problem (since I was using HLG).
  4. I also tried the YouTube suggestions and switched to 10-bit HEVC, and this should just work, but it didn’t: I could get 4K, but the HDR wasn’t recognized immediately. What I discovered was that after waiting a day, the “4K” video suddenly became “4K HDR,” so it does work, just with some latency.
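And to put the bit-depth point above in numbers:

```python
# Colors available at a given bit depth across three channels (R, G, B).
def color_count(bits_per_channel: int, channels: int = 3) -> int:
    return (2 ** bits_per_channel) ** channels

sdr_colors = color_count(8)    # 16,777,216 (about 16.8 million)
hdr_colors = color_count(10)   # 1,073,741,824 (about 1.07 billion)
```

That is 64 times as many gradations, which is what lets HDR spread its much wider brightness range across the image without visible banding.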

Step 6. Publish to Apple Photos and get the order right

The next thing to do is publish all this into a Shared Album. The main complexity is that there is no way to force an order here, so you have to be careful and methodical:

  1. Usually, I like the videos first, so do a file export for Apple Devices as mentioned above, import the results, and then create the Shared Album. Then put a comment on each video saying what it is. Because we are using the HLG format, this should look good as either SDR or HDR, so I just load a single set.
  2. Then insert the static photos in chronological order, because you cannot sort by date or anything; the album is ordered just by the time you inserted them.
