The iPhone can now pull off the rack focus we’ve all been waiting for.
The key feature, Cinematic mode, is what makes most of what’s described here possible.
Let’s dive in.
Cinematic mode
First of all, what is Cinematic mode? It’s a mode that creates an artificially shallow depth-of-field effect in your video, letting you rack focus between your subjects.
In general, smartphone video, especially in bright environments such as daytime exteriors, has a very deep depth of field because of the small sensor size. Everything is in focus, which takes away the cinematic ability to use focus to direct the viewer’s attention.
Cinematic mode takes advantage not only of the iPhone 13 Pro’s powerful processor but also of its LiDAR sensor, which can measure how far objects are from the camera. It’s the same hardware that enables Portrait mode in still photos, applied here to video. Unlike simple “artificial blur” effects, which blur everything outside the subject evenly while keeping the subject sharp, Cinematic mode is more sophisticated: it knows how far away each object is and uses that depth information to apply more blur to objects that are further away, creating a more realistic sense of depth of field.
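To make the idea concrete, here’s a minimal conceptual sketch, ours rather than Apple’s actual pipeline, of how a per-pixel depth value and a synthetic f-stop could drive the amount of blur. Every function name and number in it is illustrative.

```swift
/// Conceptual sketch only (not Apple's algorithm): pixels near the focus
/// distance stay sharp, and blur grows with distance from the focal plane.
/// `syntheticFNumber` stands in for the adjustable "f-stop" in Cinematic mode.
func blurRadius(forDepth depth: Float,       // metres, e.g. from a LiDAR depth map
                focusDistance: Float,        // metres, where the subject sits
                syntheticFNumber: Float,     // lower value = shallower "depth of field"
                maxRadius: Float = 30) -> Float {
    // Normalised distance from the focal plane (simple, symmetric model).
    let separation = abs(depth - focusDistance) / max(focusDistance, 0.01)
    // A wider (lower f-number) synthetic aperture exaggerates the blur.
    let apertureScale = 1.0 / max(syntheticFNumber, 0.1)
    return min(separation * apertureScale * maxRadius, maxRadius)
}

// Example: subject in focus at 2 m, background at 10 m, an "f/2" look.
let subjectBlur    = blurRadius(forDepth: 2.0,  focusDistance: 2.0, syntheticFNumber: 2.0)  // 0: stays sharp
let backgroundBlur = blurRadius(forDepth: 10.0, focusDistance: 2.0, syntheticFNumber: 2.0)  // large: heavy blur
```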
Apple appears to have modeled the effect on cinema imagery and real-world lens behavior, and it shows in the test footage the company has released. Apple also uses some fairly advanced analysis of what’s in your shot to predict where you’ll want focus to go before you ask for it.
When a character turns away from the camera, Cinematic mode should shift focus to what they’re looking at. If someone is about to walk into the frame, the LiDAR (which has a wide field of view) should sense them and rack focus to them before they arrive, just like a professional AC (assistant camera) would.
Does iPhone Rack Focus work?
The point is, for the right kind of shot, this is kind of great.
If you stage several actors in depth and have them look at each other, focus will magically move back and forth between them depending on who is looking where. Set up a street shot and it will drift between passersby; once you settle on one, it settles on them as well. It’s a bit like using the Sony A7S III’s excellent autofocus, except it’s arguably slightly better at this one thing.
Of course there are flaws.
There were definitely shots I thought would rack easily that didn’t. Perhaps it was something about where people were in the frame; it seemed like it couldn’t catch people too close to the edge. But after trying again and adjusting positions a bit, it mostly worked. It was a lot like the quick learning curve of figuring out how to frame a shot so an autofocus feature behaves properly.
And honestly, this is no different from a real set, where we constantly ask actors to “walk a banana” (curve their path) to stay in frame, or turn slightly to catch the light. Because it’s such a powerful tool, working around it felt entirely reasonable. We’re always learning how to deal with the limitations of different tools, and if using this properly requires a few framing and blocking tricks, it won’t take long to learn them.
The main limitation at the moment is when someone is looking at something other than a person. I couldn’t get it to rack to buildings or dogs in the background consistently. For narrative work this is less of a problem, since people mostly look at each other within the frame, but for travel videos it would be frustrating: shoot a Cinematic mode selfie and look at a mountain or temple behind you, and focus won’t follow. I suspect this will improve over time; we may even see a “travel” mode or a “monument” mode.
Racking to people entering the frame is surprisingly effective, and it’s one of the places where the technology is used to its fullest. Because the LiDAR looks beyond the edges of the image, the camera can tell when someone is about to step in and start racking to them before they’ve even entered the frame. Professional ACs do this all the time, but frankly it’s a little uncanny to watch a phone do it. A pleasant surprise.
We tested it both with actors and with people on the street, and it was great. If you need someone to enter the frame and have the camera understand they’re important, this is better than almost any other autofocus you’ll use.
The other major limitation is when you have something very sharp against a heavily out-of-focus background, like our test shot of a hand taken with the 3x lens. It’s too “extreme” an image for the processing to handle cleanly.
Image comparison
Of course, since this defocus is created digitally, it was easy to shoot a side-by-side against a 50mm prime with an ND 2.1 filter and check how the “digital” bokeh compares to real bokeh.
As you can see, the “bokeh” itself holds up relatively well, but what’s really appealing about a traditional lens is the transition between the in-focus areas and the bokeh.
50mm prime lens, ND 2.1 filter, T2, “real” bokeh.
iPhone 3x Cinematic mode. Note the fringing around the fingers.
Even at the wide angle, the fingers still show fringing.
This fringing is likely down to the low resolution of the LiDAR. The video is 4K, but the LiDAR depth map is almost certainly far coarser, so an extreme case like a hand against a distant background pushes the heavy “cinematic” blur into visible artifacts.
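To put rough numbers on that (purely illustrative assumptions, not published specs for Cinematic mode): ARKit reports its LiDAR scene-depth map at 256×192 on these phones, and if Cinematic mode works from anything in that ballpark, each depth sample has to cover a sizable patch of 4K pixels.

```swift
// Illustrative arithmetic only: 256×192 is the depth-map size ARKit reports
// on LiDAR iPhones; whatever Cinematic mode uses internally may differ.
let videoWidth = 3840.0, videoHeight = 2160.0   // 4K UHD frame
let depthWidth = 256.0,  depthHeight = 192.0    // assumed depth-map size

let pixelsPerDepthSampleX = videoWidth / depthWidth    // ≈ 15 video pixels wide
let pixelsPerDepthSampleY = videoHeight / depthHeight  // ≈ 11 video pixels tall
// A single depth sample can straddle a finger edge and the distant
// background behind it, which is exactly where the fringing shows up.
```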
This isn’t a deal-breaker. Remember, you can toggle Cinematic mode on and off at any time after shooting. But it’s worth keeping in mind when planning your shots and deciding when and where to use it.
Editing in post
One of the niftiest tricks is that you can change all of this in post simply by tapping “Edit” in the iPhone’s Photos app.
Little focus points along the bottom show where focus will land, and you can tap them to change where the shot focuses in post. You can also change the artificial “f-stop” used to create the synthetic depth-of-field effect. Delightful.
At this time, no post-production apps support this data. When you AirDrop a file, the effect is “burned in” and the edits are locked before delivery.
However, it’s very likely, virtually guaranteed, that Final Cut Pro will support it, and we’re confident a way to export the data will follow.
Macro mode
The camera has surprisingly useful macro capabilities, but they work best in regular video mode rather than Cinematic mode.
That’s because depth of field in macro shots is already naturally very shallow, and at such close distances the LiDAR may not be delivering accurate depth information, so Cinematic mode isn’t worth much there.
There’s a noticeable “pop” when pushing from wide to macro, but that’s not surprising and isn’t a deal-breaker. I can remember only one shot in my career so far where it really mattered to go from macro to a regular shot without a pop.
The macro itself is still very impressive, and while the uses for macro work in filmmaking are limited, it’s powerful when you need it, especially for transitions. It’s also a staple of documentary work and shouldn’t be underestimated as a tool for piecing an edit together.
Using third party apps
For more control over focus transitions, we recommend using a third-party camera app. Apps like FiLMiC Pro and ProCamera offer advanced features like manual focus control to help you achieve a smoother, more accurate focus rack (there’s a sketch of the underlying manual-lens API after the steps below).
- Download third-party camera apps: Search for and download FiLMiC Pro, ProCamera, or other video-specific apps in the App Store.
- Open an app and select manual focus: These apps usually have a manual focus feature. Look for the focus slider or control within the app.
- Set Initial Focus: Use the manual focus control to set focus on the first subject.
- Record a video: Start recording a video within the app.
- Rack Focus: Slide the focus control smoothly from the first subject to the second subject while recording. This allows you to precisely control the timing and speed of focus transitions.
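For the curious, this is roughly the AVFoundation capability those focus sliders build on: AVCaptureDevice lets you lock focus to a normalized lens position and step it over time. The sketch below is ours, not how FiLMiC Pro or ProCamera actually implement it, and it omits session setup and error handling.

```swift
import AVFoundation

/// Minimal sketch of a programmatic rack focus using AVFoundation's
/// manual lens control (the capability third-party apps expose as a slider).
func rackFocus(on device: AVCaptureDevice,
               from start: Float,            // 0.0 = closest focus, 1.0 = farthest
               to end: Float,
               over duration: TimeInterval,
               steps: Int = 60) {
    guard device.isLockingFocusWithCustomLensPositionSupported else { return }
    let interval = duration / Double(steps)

    for step in 0...steps {
        let t = Float(step) / Float(steps)
        let lensPosition = start + (end - start) * t
        // Schedule each small focus step so the rack plays out over `duration`.
        DispatchQueue.main.asyncAfter(deadline: .now() + interval * Double(step)) {
            do {
                try device.lockForConfiguration()
                device.setFocusModeLocked(lensPosition: lensPosition, completionHandler: nil)
                device.unlockForConfiguration()
            } catch {
                // Ignoring configuration errors in this sketch.
            }
        }
    }
}

// Usage: rack from a near subject to a far one over two seconds.
// if let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
//     rackFocus(on: camera, from: 0.2, to: 0.8, over: 2.0)
// }
```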
Tips for better results
- Use a tripod: To ensure video stability when changing focus, it’s helpful to mount your iPhone on a tripod.
- Practice the move: Pulling focus smoothly takes practice, especially with manual controls. Take some time to rehearse before shooting your final video.
- Consider your subjects: The distance between the two focal points affects how noticeable the focus shift will be. Position your subjects to maximize the effect (the rough optics are sketched below).
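For a rough sense of why that last tip matters, here’s the standard thin-lens blur-circle relation for a real lens (the synthetic effect presumably behaves similarly): the farther the second subject sits from the focus distance, the bigger its blur circle, and the more dramatic the rack reads.

```swift
// Real-lens optics only: blur-circle diameter on the sensor for an object at
// distance d when the lens is focused at distance s. A bigger blur circle
// means a more obvious focus rack between the two subjects.
func blurCircle(focalLength f: Double,   // metres, e.g. 0.05 for 50 mm
                fNumber n: Double,
                focusedAt s: Double,     // metres
                objectAt d: Double) -> Double {
    return (f * f * abs(d - s)) / (n * d * (s - f))
}

// 50 mm lens at f/2, focused on a subject 1.5 m away:
let closeSecondSubject = blurCircle(focalLength: 0.05, fNumber: 2, focusedAt: 1.5, objectAt: 1.8)  // ≈ 0.14 mm
let farSecondSubject   = blurCircle(focalLength: 0.05, fNumber: 2, focusedAt: 1.5, objectAt: 6.0)  // ≈ 0.65 mm
// Roughly 4–5x more blur when the second subject is well separated in depth,
// so the rack between them reads far more clearly.
```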
Conclusion
Cinematic mode is definitely a big step forward for the iPhone, allowing you to capture “cinematic” images that can focus your audience’s attention, all with your phone’s camera.
Leveraging LiDAR for this “digital defocus” effect turns out to be surprisingly fun. We hope to see post-production tools that support the footage soon, since we’d rather make tweaks and adjustments on a larger screen. But overall, this is an absolutely remarkable tool, and if you need to squeeze a camera into tight spaces, the iPhone 13 Pro continues to be the one to consider.
I’ve only had the camera for a few hours at this point, so this is just what I’ve learned so far. We’ll keep shooting, but there’s clearly a lot of interesting technology here for filmmakers to dig into.